Sunday, October 29, 2006

Baroque Cycle and Change

Since I've already marked myself out as a hopeless nerd (see Free Software), I may as well also admit that I love science fiction. I've just finished Neal Stephenson's Baroque Cycle trilogy. It's not for the faint of heart. Each volume -- Quicksilver, The Confusion and The System of the World -- is about 900 pages and has a cast of characters and scope that make War and Peace look like a one-act play. But it is also a good illustration of what I find so fascinating about Sci-Fi, stripped of the irrelevant and distracting bits. The genre isn't primarily about the future. It's about change.

Unlike most of Stephenson's earlier works, the Cycle is set in the past, during the 17th and early 18th centuries, when the world was moving from one dominated by religion and aristocracy to one driven by science and economics. The characters include most of the prominent members of the Royal Society (Isaac Newton, Robert Boyle, Robert Hooke, etc.), Louis XIV of France, William of Orange and James II of England. The fictional ones include vagabond Jack "Half-Cocked" Shaftoe, Eliza, a former harem slave turned duchess, and a smart but underachieving member of the Royal Society named Daniel Waterhouse. The action moves through India, North and South America, Europe and North Africa. A reviewer for the Telegraph put it best: you don't just read the Baroque Cycle, you move in and raise a family.

Like all good science fiction, the tension in Stephenson's stories comes not from prediction, but from responding to dramatic or catastrophic shifts in perception. That's why Jules Verne and H.G. Wells novels remain good reads, even if little of what they describe has come to pass, or ever will. It's about contemporary humans confronting a new world, one where the old rules no longer apply, and they must adapt.

The Cycle is about a time when almost all modern concepts in science, such as physics, computing and cryptography, first cropped up, even if the inventors didn't yet have the resources (i.e., engines) to put all of them into practical use. At the same time, politics was being transformed by the rising power of the "Mobb," the common man, and nations were beginning to grasp the power of currencies and international trade. Battles were still fought with guns and swords, and obviously still are, but economic warfare was already becoming crucial. A smart investor with some cash could bring down a government, then as now. Those several decades were the historical equivalent of a perfect storm.

Half the fun of reading such a novel is to put myself in some of the characters' places and imagine how I would respond, whether I would recognize the changes in time, or know how to take advantage of them. With Stephenson's story, we know what happens next. It took another 200 years before economists understood fully that money is a confidence trick. It isn't based on gold, but on the belief that a nation can generate the wealth to back it up. And we're still grappling with the implications of our understanding of the universe, and our relatively minor role in it.

We're also living in interesting times now: We have all of the world's knowledge at our fingertips. Anyone can write, film or record something and have it instantly read or viewed by a billion people. And, on the negative side, any tin-pot nut case can acquire the know-how to kill thousands. I wonder how long it will take us to fully grasp today's changes.

Monday, October 23, 2006

rly fun

I sometimes worry that the decision to move to the U.K. was a mistake. I took the kids away from easy access to their grandparents and extended family. But, once in a while, the children have an experience that would be hard to duplicate elsewhere. Right now, my eldest son is on a student exchange in France, living with a family in the south of the country for a week. We worried how he would handle days alone with a strange family, speaking a language he barely understands. Yesterday, I got a text message:

"I went huntin. It was rly fun."

"What were you hunting?" I replied. "And did you kill one?"

"4 pheasants and partridges. I didn't kil. any but they (chris+his dad) got 3 pheasants + 1 partridge."

I'm sure trooping around the French countryside killing birds is improving his French...

Sunday, October 22, 2006

Free software

As I'm considering what I'm going to do with the second half of my life (see previous post), I thought it might be useful to review what I'm interested in. My fascination with free software such as Linux probably marks me most as a nerd. But what I find most fascinating about it isn't the obvious. I'm drawn to the ideas: that sharing makes something worth more, or that real economic value doesn't have to be derived from scarcity.

Free software is a bit like science. Scientists publish. The data is then ripped apart by other scientists. What holds up to scrutiny becomes the foundation for the next discovery. That process is the way, as Isaac Newton put it, that scientists "stand on the shoulders of giants." Few would argue that science lacks value, or that it would be worth more if kept secret.

Software used to work more like science, back when the commercial value was thought to lie purely with the hardware. Software (in source code form, so that you could see how it worked) was freely distributed and shared. It was when the tide was turning toward closed, commercial software that Richard Stallman, a programmer at MIT, formed the Free Software Foundation. His aim was to create everything you need to use computers -- the operating system, the applications, etc. -- in a free and open way. His genius was to use copyright law to ensure that the software stays free. The FSF's GNU General Public License gives you the right to use the software (for any purpose), share it with anyone, view the source code and modify it. The catch is that you can't remove any of those rights. If you modify the software and distribute it, you have to pass on the same rights. That restriction is enforced by copyright law, which gives the creator of the software the ability to dictate the terms of use. In this case, the terms are the opposite of the usual restrictions ("all rights reserved"), hence the GPL's nickname: copyleft. There are several other free software licenses, but the basic concept is usually the same.

One point of confusion comes from the limitations of the English language. The "free" in free software doesn't mean free of cost; it means freedom. You're perfectly entitled to sell your modifications, and people do, but the Internet's easy copying means most free software developers now make their money by selling their skills (e.g., getting a job with a company that uses free software) or by offering technical support (such as Red Hat Inc. or Canonical Ltd.). What's interesting about free software is that the economic incentives, plus "copyleft," ensure that the general level of quality rises. If you're looking to sell your skills, you'll want to make your work (free for anyone to see) as high quality as possible. If you're selling technical support, you had better make your software as trouble-free as possible, to lower your costs. And, in both cases, you're obligated to share your work with everyone else. But what I find most interesting is that much of the work, maybe the majority, is done by people who get no economic benefit whatsoever. They do it because they find the work engaging and they believe in freedom.

Whatever the motivation, the system works, and has generated billions of dollars in market value. Every time you use Google or Amazon, you're using free software. The majority of the Internet is built with it. As thousands of developers tweak, twiddle and fix, that work goes back to the community. Everyone benefits.

As an end user, however, benefiting used to be hard. I remember starting with GNU/Linux, a combination of the Free Software Foundation's applications and a free operating system, back in 1993. Hardware support was limited and it was a real effort to learn. Nowadays, it is so easy it's almost boring. I currently use Ubuntu, one of many "distributions" of GNU/Linux, which handle the work of combining the applications and the operating system into a usable package. A new version is due out this week (Oct. 26). You can try it before installing it by running the software from a CD. Wireless networking can still be a bit tricky, but almost anything else I throw at it (digital cameras, iPods, etc.) is handled with aplomb. Today, you don't even need to use a free operating system to try free software: just try Firefox.
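One small habit worth picking up before burning any downloaded disc image to CD is to check its MD5 checksum against the one the distributor publishes (Ubuntu posts an MD5SUMS file alongside each release). A minimal sketch -- the filename here is a stand-in created on the spot, not a real Ubuntu image:

```shell
# Create a stand-in file so this example is self-contained; in practice
# you'd run md5sum on the real .iso you downloaded.
echo "hello" > ubuntu-demo.iso

# Print the file's MD5 checksum; compare it by eye (or with grep)
# against the published MD5SUMS file from the download site.
md5sum ubuntu-demo.iso
```

If the printed hash doesn't match the published one, the download is corrupt, and a CD burned from it will likely fail in odd ways.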

I'm not against commercial software. I use Windows for 10 hours a day. It's fine. But I do believe free software will win out in the end, just as open science won out over alchemy, which was also based on secrecy. What holds most people back from using free software is the subject of another post, but basically it's responsibility. People don't want to take it. Using free software when no one else you know does means taking responsibility for your own computer. People would much rather just let Microsoft (or whoever) handle it. It gives them someone to blame. It's why my children's schools all sign expensive contracts with Microsoft Solution Providers. They don't want the responsibility. I can understand that and feel no compulsion to "evangelize."

But personally, 25 years of using computers is enough to demonstrate that commercial companies don't give a fig about me. I'm only valuable to them if I keep on buying. Putting important information in their hands -- photos, important documents, music -- is nuts. I've pumped out a lot of information into proprietary applications that is no longer accessible. I've paid a lot of money just to keep what I have. I'm motivated to take responsibility, and to help out where I can. I report bugs, test new software and help others when asked. It's the least I can do, and in the meantime I get to watch a revolution unfold.

Saturday, October 21, 2006


I was reading a blog post the other day by Scott Adams, the creator of the Dilbert comic. He advocates using what pop-psychologists and self-help gurus call affirmations. He writes down his goals 15 times a day and says it has an almost magical effect.

I don't believe "affirmations" are anything new. They were probably invented around the same time as stone tools. People used to call it prayer. Repeating something often enough, in a positive way, obviously has an effect. It keeps our goals at the forefront of our minds and keeps us alert to opportunities. Adams certainly has a long track record of achieving his.

But Adams's post brought up a troubling thought: I don't have any idea what I would write down as goals. I have wishes -- I want every member of the family to be happy, for example -- but those are mostly out of my hands, more the domain of prayer than affirmations.

So what would I write down, assuming (and this is a big if) I'd have the patience and the self discipline to do it several times a day? On this side of 40, the question has become more difficult. I wanted to be a journalist and I succeeded. I could always be better, of course, but I've definitely achieved that one. Marriage, family, buy a home? Check. Travel and live abroad? Doing that. Most people at this stage just carry on as before. I'd rather have a more definite destination in mind for the second half of my life. It shouldn't just be a 40-year winding down.

Financially, I don't necessarily want to be rich, but I do want the flexibility to trade money for time. I want to be able to work less and spend more of my time with the family, to pursue my interests, to get involved in causes I'm passionate about. At the moment, I spend the great majority of my waking hours on making money, all of which goes right out the door again to house, feed, clothe and entertain a family of seven in one of the most expensive cities in the world. The rest of the time is spent taking care of -- fixing, filling, emptying, watching, listening to or cleaning -- all of that accumulated "stuff" that seems to gather around us like dust bunnies. Life is upside down. The important is crowded out by the mundane.

Of course, it's not really a finance issue. If I were richer, the ratio of my time spent on "taking care of stuff" and on food and shelter would just shift a bit. It's really about priorities -- setting them and sticking to them -- and then I'm back to the problem of goals.

I started out writing this thinking I'll just list some, but I'm realizing now that it isn't that easy. I'm already making choices -- every time I buy more stuff or stay late at the office -- but with no definite plan. I'm just going with the flow, unsure of where it's taking me.

Answering that question is tougher than I thought -- a worthy goal.

Friday, October 13, 2006

Child Prodigies

I worry too much about my kids. I forget what I was like as a child -- hopeless at school and socially awkward. Yet today, although I'm no Nobel Prize winner, I get by. Ironically, I'm making my living doing precisely what I most struggled with as a student. I learned to write years later than my peers, but ended up employed for the past 20 years doing just that.

One of my favorite writers, Malcolm Gladwell, gave a speech on the "myth of prodigy," summarized here, that's really worth reading. It's a good reminder that I need to worry less if not all my children are at the top of the class. A snippet:

Gladwell cited a mid-1980s study (Genius Revisited) of adults who had attended New York City’s prestigious Hunter College Elementary School, which only admits children with an IQ of 155 or above. Hunter College was founded in the 1920s to be a training ground for the country’s future intellectual elite. Yet the fate of its child-geniuses was, well, “simply okay.” Thirty years down the road, the Hunter alums in the study were all doing pretty well, were reasonably well adjusted and happy, and most had good jobs and many had graduate degrees. But Gladwell was struck by what he called the “disappointed tone of the book”: None of the Hunter alums were superstars or Nobel- or Pulitzer-prize winners; there were no people who were nationally known in their fields. “These were genius kids but they were not genius adults.”
Now look at it the other way:

The other way to look at precocity is of course to work backward — to look at adult geniuses and see what they were like as kids. A number of studies have taken this approach, Gladwell said, and they find a similar pattern. A study of 200 highly accomplished adults found that just 34 percent had been considered in any way precocious as children. He also read a long list of historical geniuses who had been notably undistinguished as children — a list including Copernicus, Rembrandt, Bach, Newton, Beethoven, Kant, and Leonardo Da Vinci (“that famous code-maker”). “None of [them] would have made it into Hunter College,” Gladwell observed.
In fact, being considered a child prodigy can be a handicap, Gladwell says. Mozart, the one "poster child" for precociousness, in fact wasn't one. What he did, perhaps driven by a pushy father, was work his tail off.

Here in the U.K., even more so than in the U.S., the focus is on spotting talent early, and giving up on the rest quickly. You're pegged as university-bound at 11 and football-player material at 7 or 8. Students here have to make drastic choices about their future at 13 or 14, deciding whether to stick with maths and science, or give up on foreign languages. At that age, I'd have been pegged as a non-achiever and relegated to stopping after high school (if I even made it that far).

My advice to my kids is to keep their options open, with basic and broad subjects, and not get pegged. It's hard work, like swimming upstream. The schools want you to decide, or they'll decide for you. I need to send a few of those teachers a copy of Gladwell's speech.

In the meantime, though, all of my kids are more socially adjusted and doing better at school than I ever did. Their struggles are minor, not major. They'll be OK.

Cures for the Common Cold

I've been bedridden and out of work for a couple of days now with a cold I dragged back from the U.S. I must not be used to the germs there, because this one has floored me more than usual. I think I caught it from my father, who was over it in 12 hours.

With little else to do, I've been thinking about all of the proposed cures. My mother's latest preventative, if not cure, is a protein drink a day. Yes, the same stuff bodybuilders use to "bulk up." My very thin mother needs to bulk up; I don't. My wife Theresa swears by echinacea, an herbal remedy. I believed it worked if I started taking it right at the beginning of a cold, but it hasn't touched this one as far as I can tell. Vitamin C is one of those cures that only seems to work if you believe it does; the same goes for zinc. The Wikipedia article on the common cold lists a few more, albeit similarly iffy, cures.

For now, the only cure I've found is to sleep through it.

Thursday, October 12, 2006

Family at a Distance

My sisters

I was home over the weekend for my sister Dara's birthday and, finally, got a few of the family members on Skype. We haven't tried to use it yet, but I'm hopeful it'll let me hear their voices now and again. At the moment, we see each other only every couple of years, at most. I met my youngest nephew for the first time and reacquainted myself with the older ones, so that I'd at least recognize them if I met them on the street. I also got a quick top-up on Americana -- watching a school football game, in person, and a baseball game or two on my father's giant-screen TV.

But the most important thing I've learned in the last two trips to the U.S. (the prior one being for my mother-in-law's funeral) is that I need to do more on my end to keep in touch. Letting this blog languish isn't helping, nor am I going out of my way to email. So, for the time being, I'm back to posting and will try to keep it up, even if I don't have anything brilliant to say.