[I just posted this to the Couchbase Mobile community mailing list.]
TouchDB is a project I’ve been feverishly working on for a few weeks. It’s an investigation into the feasibility of a CouchDB-compatible database rewritten from the ground up for mobile apps. The comparison I like to make is that “if CouchDB is MySQL, then TouchDB is SQLite”. In fact, it uses SQLite as its underlying storage engine. You can read a longer justification for it on its wiki, as well as an FAQ and design document.
— It speaks CouchDB’s replication protocol. I’m pretty serious about that; I’m even documenting the
— It also understands a large subset of the REST API, enough so that it works with CouchCocoa. I’ve got a clone of Grocery Sync working as one of the demo apps in the project.
— The current implementation is for iOS. If the investigation pans out we’ll port it to Android, and possibly other platforms.
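To give a flavor of what “speaks CouchDB’s replication protocol” means, here’s a sketch (in Python rather than TouchDB’s Objective-C, with invented data shapes) of the protocol’s core “what’s missing?” step — the calculation behind CouchDB’s `_revs_diff` request, where the replicator asks the target which revisions it doesn’t have yet:

```python
def revs_diff(source_revs, target_revs):
    """Given {doc_id: [rev_ids]} known to the source and to the target,
    return the revisions the target lacks -- the set the replicator
    must push. Models CouchDB's _revs_diff calculation."""
    missing = {}
    for doc_id, revs in source_revs.items():
        have = set(target_revs.get(doc_id, []))
        need = [rev for rev in revs if rev not in have]
        if need:
            missing[doc_id] = {"missing": need}
    return missing
```

Only documents with revisions the target hasn’t seen get transferred, which is what makes replication incremental.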
TouchDB is certainly not ready for prime-time yet, but here are some current statistics to whet your appetite:
August in Los Angeles was bone-dry and dusty, but he left it behind in the parking lot as he made his way through the series of three doors, heavy and white, and into the frozen refuge of the ice bar. He was known, there, and the hostess greeted him with a sealskin robe, slipped over his shoulders before he had time to start shivering. The tip of her elegant nose felt icy against his own.
There was room for one more at the bar, and at a nod from the chef he took the seat gratefully. One often had to wait, stamping feet to ward off the cold. The chef slid the amuse-bouche before him as he unfolded his napkin, and it was exquisite in appearance: a translucent carpaccio of walrus blubber sprinkled with snowflakes. The snowflakes were not unique; in fact they came in precisely two shapes, one sprinkled on the left side of the dish, the other on the right. They made not-quite-imperceptibly different crunches as he ate them. It was touches like this that had made the chef’s name when he was but a young man just arrived from Nunavut.
I really don’t know how long I’ve been lying on the couch, watching the men on the TV. I don’t remember things so well anymore, since the accident. I don’t remember the accident either, but my friends tell me it was pretty bad. I have healed about as well as I’m going to, and though I don’t get around well, I can still think. In small doses.
The men on the TV gesticulate about some crisis or other; I can’t tell what, because the sound is off. They look angry — at me, at all of us, at themselves. Small text crawls across the screen above and below them. The TV men look very tired, too, as tired as I feel, and perhaps lost and afraid. I feel such sympathy; I would like to turn up the volume and learn more of their situation. Maybe I could ask one of my friends to.
I had lost this historical document for a long time, but finally found it the other day on an old backup CD. It’s the original 1997 sketch I made of a chat user interface based on speech balloons.
After digesting yesterday’s iPhone announcements [with fava beans and a nice Chianti] I started thinking about the pricing models made possible by the “Application Store”. In particular,
How cheap can an iPhone app be?
I think the answer’s clear. The Application Store will obviously be based on the iTunes store, whose bread-and-butter is a product, the AAC audio file, that sells for … 99¢. Apple’s clearly able to make a profit at that price point, despite credit-card processing fees, bandwidth costs, and comparable payments [Updated. Thanks, Dru!] to the record labels. So I see no reason they wouldn’t allow a developer to price an application that low.
But why would a developer want to sell an application for a net 70¢?
Because at such a low price, with a one-click store a couple of taps away, it becomes an impulse purchase. It’s a form of micropayment, an idea that’s been talked about for years but hasn’t widely taken off due to the practical difficulties of collecting very small payments. The few areas where micropayments (albeit larger than the canonical 1/10¢ originally proposed) have worked include the iTunes store, and the downloadable-game stores for the Xbox and Wii.
And let’s not forget the most amazing example of what people will pay for if you make it convenient enough: ringtones. The practice of charging suckers $2 for a 30-second snippet of a song they already have is a multi-billion-dollar industry.
Here’s a career update, for those of you who care: I’ve left Apple, and I’m now working on my own, from home, as an indie software developer. I have plans for at least two kick-ass Mac apps, I’ll probably contribute to a few open source projects, and I may dabble in some web stuff.
(At least, that’s the plan for now! Everything is subject to change without prior notice. This document contains forward-looking statements. These statements involve risks and uncertainties, and actual results may differ.)
This is kind of a big change for me. I’ve been continuously employed for 19 years, 16 of those at Apple. I clearly like being part of a team, part of a company, and specifically part of Apple. But there comes a time when a man’s gotta do what a man’s gotta do.
Warren just emailed me this photo, commemorating the release of the first Apple product I worked on. That’s me on the second row, second from the right, with the dopey expression (but no gray hairs.)
Paul Graham [who is obnoxiously elitist, but frequently insightful] has a new essay, “Holding a Program in One’s Head”, that is making me feel sad this morning.
We love to play the Hero — exploring dungeons, grabbing treasure, saving the world from evil. But I started wondering about the reasons behind some of the actions in such games, and especially about what my Heroic deeds looked like to the ordinary people of the lands I passed through. (As my wife once put it: “Why isn’t there a Hug button?”) The result is this story.
I don’t normally write this sort of antiquated prose, but the genre does require it. It was actually a fun exercise, and I’ve tried to affect more of a James Branch Cabell or Lord Dunsany voice, rather than the tiresome faux-Tolkien of most current heroic fantasy.
Here’s my family recipe for apricot jam, handed down through generations. One generation, really — my mom got it from a pamphlet put out by some local women’s group, after we moved to an old ramshackle house in the middle of a huge but disused apricot orchard. The trees were old, but a lot of them still produced fruit, and it was no trouble to walk around and collect bucketsful. So we needed some way to make use of all that fruit…
This recipe is different from the usual one you find packed in a box of pectin, because, well, it doesn’t use pectin. Instead, you thicken the jam by cooking it a lot longer. This means it tastes less like fresh fruit; but it has a wonderful taste of its own, a bit like dried apricots, and a nice gloopy texture. As a bonus, putting an apricot kernel1 in every jar gradually adds an almond-y aroma2.
Ever since Brent “NetNewsWire” Simmons posted his Thoughts On Large Cocoa Projects the other week, I’ve wanted to add some of my own tips. I’ve worked on some big projects (iChat, Apple’s Java runtime, OpenDoc) and have sometimes had to find my way around in others (Safari, Mail), so I know what Brent means when he says:
There’s no way I can remember, with any level of detail, how every part of [my app] works. I call it the Research Barrier, when an app is big enough that the developer sometimes has to do research to figure things out…
It’s been said many times that “the main person you’re writing comments for is yourself, six months in the future.” It’s always a good idea to keep that shadowy figure in mind while you code. Here are some other techniques I’ve found invaluable:
The problem with writing about something I dislike is that, after the momentary pleasure of getting it off my chest, there’s not a lot of motivation to read people’s responses (especially the argumentative ones.) Better to pick as a topic something that I do like very much … such as music.
I can’t claim to be an expert on music: I can only barely play an instrument, my dj skills are wack, the theory hurts my brain, and my knowledge is encyclopedic only in a few micro-genres. But I’m rabidly enthusiastic about it; and fortunately, music nowadays is tightly entangled with computer technology, which (like any engineer) I can easily sound like an expert on.
Like most geeks, as a kid I not only despised the Cool Kids but also wanted to be one of them. My own school-age development trajectory took me from a state of total ignorance of what that required1, to brave attempts to fit in2, to a realization that different was cool3.
Anyway: these days being a Cool Kid is within every geek’s reach. Perhaps that’s because the shared culture has exploded into an uncountable number of fragments, each of which is a tribe with its own parallel hierarchies of coolness. Amen to that.
I write code for a living. After all these years I still find it really exciting — I was instantly and permanently addicted at age 11, it’s just that the programs have gone from 20-line BASIC powers-of-two table printers, to enormous Java and Objective-C juggernauts — and moreover I’ve found it’s the one thing that I can work on consistently enough over a long period of time to finish a project of any size. My childhood was littered with unfinished stories, unfinished plans for undersea cities, unfinished D&D maps. But the programs got finished. (Most of the time.)
Herewith, entirely too much detail about the different programs people have paid me to write. Read on if you want, but you’re in the driver’s seat so feel free to hit that Back button if your eyes glaze over…
Just when it seemed, a decade ago, that the programming world had settled on C++ as the lingua franca, the One Language To Rule Them All, instead we got an explosion of new high-level languages that have risen to popularity. Why did this happen? Chiefly because the World-Wide Web has conditioned users to expect five-second delays before any responses to their actions, which provides an environment ideally suited for interpreted, garbage-collected scripting languages. This movement has been encouraged by server vendors like Sun and IBM who are eager to show Web developers the productivity increases they can get by using such languages — especially once those developers install massively powerful servers.
Much of what I’m consumed with (at work) boils down to a question of: what is the right shape for the small but plentiful bits of writing that we are all creating daily? Here shape means largely visual representation but also sequencing and topology.
It’s a problem of hypertext, primarily. The World Wide Web established one shape for hypertext: individual pages with one-way links in the text, replacing one another in a back-forwards chain. It’s proven to be a pretty good shape, but it’s not the only one, and earlier thinkers like Engelbart and Nelson had lots of other ideas.
This is one of my favorite interviews ever, and it reminds me of a long-gone era when the Cocteau Twins mattered, mattered really deeply, and were making music I could barely believe possible. Music I was not the only one to find wholly impossible to describe…
Time to bore you young whippersnappers with my early history in computers. (I saw a couple of other people do it and thought hey! I can do that too.) First we have to set the Wayback Machine for the darkest depths of the ’70s, a decade that’s oh-so-much funner as retro than it was to live through…
The Simple Mail Transfer Protocol is by no means perfect — its lack of authentication is a prime reason why spam is such a problem — but I think it got one thing right: it has the right topology for building a person-to-person communications system.
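What I mean by topology can be sketched in a few lines of Python (a toy model with invented class names, not real SMTP code): each user deals only with their own server, and servers relay mail to each other peer-to-peer, with no central hub anywhere.

```python
# Toy model of SMTP's topology: a client submits mail to its OWN server,
# servers relay peer-to-peer to the recipient's server, and the recipient
# reads from their own server's mailbox. No central hub.
class MailServer:
    def __init__(self, domain, network):
        self.domain = domain
        self.network = network        # domain -> server; stands in for MX lookup
        self.mailboxes = {}           # address -> list of (sender, body)
        network[domain] = self

    def submit(self, sender, recipient, body):
        """Client-to-local-server step (like SMTP submission)."""
        domain = recipient.split("@")[1]
        peer = self.network[domain]           # the "MX lookup"
        peer.deliver(sender, recipient, body) # server-to-server relay

    def deliver(self, sender, recipient, body):
        self.mailboxes.setdefault(recipient, []).append((sender, body))

network = {}
home = MailServer("example.com", network)
work = MailServer("example.org", network)
home.submit("alice@example.com", "bob@example.org", "Lunch?")
```

The point is that any two servers can interoperate without either one being subordinate to the other — which is the shape a person-to-person system wants.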
I decided I would only work on this page after ten PM, when I think differently. When I’m tired but alert, and everyone around me is asleep, and it’s dark and quiet.
The cognitive scientist and AI researcher David Gelernter has a model of consciousness that has focus as its parameter: varying focus produces mental states from rigorous logical thought (when focus is at its highest) all the way down to dreaming (when focus is at its lowest.) In high focus states the mind seizes precisely on individual concepts and ideas. In low focus states, multiple ideas, concepts and memories overlay each other such that they can’t be distinguished; they’re superimposed so that common features line up, forming connections between disparate thoughts. In this mode the mind jumps from one memory to another, linked by a chain of connections formed by lining up fragmentary images of those memories. Like a dream.
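That low-focus mode almost begs to be sketched in code. Here’s a toy version — my own construction with invented memory data, not Gelernter’s actual model: memories are bags of features, and at low focus the mind hops to any memory that shares a feature with the current one.

```python
def low_focus_chain(memories, start, steps):
    """Hop from memory to memory via any shared feature -- a crude
    stand-in for low-focus, dream-like association."""
    chain = [start]
    features = set(memories[start])
    for _ in range(steps):
        hop = next((name for name, feats in memories.items()
                    if name not in chain and features & set(feats)), None)
        if hop is None:
            break                      # nothing left to associate with
        chain.append(hop)
        features = set(memories[hop])  # the new memory drives the next hop
    return chain

memories = {
    "beach":   {"sand", "summer"},
    "desert":  {"sand", "heat"},
    "kitchen": {"heat", "bread"},
}
```

Starting at “beach,” the sand leads to the desert, and the heat leads to the kitchen — a chain no high-focus train of thought would produce.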
The beach that fall was overrun by glass lizards; I had never seen them before. I knew there had been none at the beach when we had visited during the summers (I could remember four summers, and my parents said there had been more.)