Monday, November 29, 2010

A musing: measuring value

In the previous post, I asked how much of what humans consume is produced through non-human means. That question hits upon one of my core criticisms of modern economics, which is that though economics is the discipline that is supposed to be able to compare apples to oranges, it is, in my opinion, exceedingly bad at doing just that. However, to the discipline's credit, comparing apples and oranges may turn out to be a fundamentally hard problem.

The recessionary events of the last few years have hopefully stirred up people's imaginations and led them to asking, “What's going on?” What finer question can one ask! Those of us who disfavor resorting to simple, ready-made answers to such involved and difficult questions soon run into a core problem in economics: by what unit do we measure value? An apple's value can easily be compared to an orange's if apple and orange both can be converted to some common medium of exchange. In the case of real-world apples and oranges and most anything else, that medium of exchange is currency, and in my country, the dollar.

However, most of what accounts for wealth these days is not in the form of apples or oranges or anything else tangible and intrinsically valuable. It's in the form of strictly monetary assets such as bank account balances, shares of stock, mortgages, futures, options, and so on, all of which are reducible to someone somewhere promising to pay something to someone sometime else—IOUs. Or, on the flip side, someone somewhere staking a claim on some future production. One problem this poses for us is: if most of what constitutes wealth nowadays is money, then how accurate is it to use money itself as a measurement of value? It's easy to know the length of things when you have a ruler, but if everything in the room is a ruler, and every ruler has a different-sized inch, then it's nigh hopeless to know the length of anything. That's irony. It's also what passes for modern-day mainstream economics. We know the price of so much but struggle with knowing value and thus with questions such as, “How are we doing compared to X years ago?”

There's a Native American proverb that says, “Only when the last tree has been cut down; only when the last river has been poisoned; only when the last fish has been caught; only then will you find that money cannot be eaten.” As stupidly simple as this is, it's worth repeating that money has no intrinsic value, and so measuring the value of something like an apple or an orange in dollars is done with the expectation that dollars are stable and universally transferable. Currency itself is otherwise meaningless as a metric. History shows this assumption holds true most of the time and fails spectacularly during the remainder. So it is that I expect an economist to relate the value of things in dollars or any other currency most of the time but not to rely upon the conversion exclusively. But that is not what I observe in fact. Stats such as GDP and the trade deficit/surplus are attempts to squeeze value into currency alone, and such monetary metrics appear to hold a monopoly over the minds of mainstream economists. The problem with this strategy is that a significant portion of the production being measured is itself money.

I think the reason for this shortcoming is that we haven't yet figured out a suitable alternate metric. My own guess is that there doesn't exist a simple scalar unit any better than currency; any superior alternative won't mask as much of the complexity of the measurement as currency does. The situation reminds me of truing a bicycle wheel. For those of you who haven't ever attempted to true a wheel, know that it's a hard task that's more art than science. A bicycle rim is made into a circle (or something closely approximating a circle) by adjusting the tension of the spokes. Each tightening or loosening of any one spoke affects the tension of the many spokes near it so that fixing a wobble in one part of the rim may create another wobble or two elsewhere. There's no simple way to analyze a wobbly rim and say, “Aha! That's the one spoke that needs to be adjusted and by yea much.” The practical way to true a wheel is to start at the biggest wobble by adjusting its nearest spokes and then work outward spoke by spoke, possibly many times, to minimize the effects of the spreading waves of newly introduced wobbles. Eventually, if you're doing more good than harm, the wheel is made true enough.

So I suspect that something similar is going on with economics. Any measure of value is hopelessly tied to something else of value, and I suspect that any superior metric won't hide the resulting non-linearity. But what would such a metric look like?

Saturday, November 27, 2010

A question: production

I messed up the scheduling for auto-posting what should have been last Thursday's post. Here it is with the usual apologies for lateness. Have a happy Saturday After Thanksgiving.

Today I'd like to pose a question. It's one that's too tough to answer, but it's fun to think about how one would even go about trying to answer it. That said, here's the question.

How much of what all of humanity consumes is not produced by humans?

Some of what we consume is produced because of human-driven processes, and some of what we consume is not. For example, only humans are capable of assembling the laptop I'm using to compose this blog entry, and thus laptop assembly is produced by humans. However, the sweet potato I'm planning on eating for supper tonight was created due to many non-human inputs, such as sunshine and photosynthesis, and if I didn't buy my food through the grocery store and the industrial agriculture machine but instead grew the potato myself then the percentage of non-human inputs could have been larger. For example, the potato could have been grown without chemical fertilizers or irrigation but instead only have used naturally existing soil nutrients and rainfall. Well, probably not rainfall. Not here in Phoenix.

Indeed, little is entirely human- or non-human-made. The most “natural” of resources must often be transported to their place of consumption, and transportation involves some bit of human input, whether it's trucked (a complex activity) or tossed into a river to float downstream (a simple activity). Contrariwise, the most “unnatural” of resources, such as my laptop, contain countless raw inputs that cannot be synthesized through human means alone. In the case of my laptop, even the humans assembling it require countless non-human-made inputs, especially if they're eating organic sweet potatoes on their lunch breaks. An input of an input is still an input.

The proposed question is hard exactly because it seeks to quantify a supply chain that, in theory, regresses backwards in time to a seeming infinity of breadth. Each supply-chain input to my laptop has its own production process involving many more inputs, each with its own production process with even more inputs and so on. If you go back far enough, everything we touch, including ourselves, is nothing but exploded stardust. But the question isn't intended for reduction to absurdity. Rather, it's intended to ask: how much of what we're consuming can we give ourselves credit for producing? Do humans account for the production of even half of what we consume?

These questions have to do with a problem with modern mainstream economic thought: that which cannot be monetized cannot be accounted for and therefore isn't accounted for. My laptop cost me about 1,000 sweet potatoes (I know, I overpaid for that video and memory upgrade), a comparison that I can make because both laptop and sweet potato alike can be converted to a common medium, the dollar. But how much is the air I'm breathing worth, measured in dollars? Right now, it's worth nothing because there's no shortage of it (and no control over it). But if there were a shortage then it would become priceless. Either way, air is something that cannot easily be priced, and yet air is obviously critically important and oxygen specifically—volatile and short-lived free oxygen—is available only because of the biosphere's photosynthetic processes going on right now. No laptop is going to be assembled without those photosynthetic processes. How does one account for that in the ledger? Think about the matter a little more and you'll undoubtedly be able to come up with many more natural, “priceless” supply-chain inputs like sunshine and rainfall for the production of nearly anything, but being priceless, the inputs aren't accounted for and cannot be accounted for. But of course, not being accounted for doesn't make an input any less critical.

Monday, November 22, 2010

Inexpressible in C

Recently at work I encountered something I'd never previously seen in the C language: an inexpressible pattern.

(To note: I'm using the term “pattern” here to denote a design at an abstract, conceptual level, like where someone may say, “Just code the state machine in a switch block within a while loop.” In such a case, implementing a state machine where each state comprises its own unique case block within the switch constitutes a pattern.)

What I ran into at work was a pattern—something I wanted to implement—but couldn't because the language won't allow it. I don't remember this ever happening in C. Sure, there are countless patterns that are ridiculously awkward when expressed in C—for example, object hierarchies—but this pattern I wanted to implement cannot be expressed at all, as far as I can figure out.

What pattern is this? Succinctly, it's the pattern of “a function that returns a pointer to its own type.” This may sound simple enough, but it can't be done—not directly, not exactly, at least—in C. A little code may illustrate best what I'm writing about.

typedef void * (foo_t)(void);

foo_t *bar(void);

Above is a function, bar, that almost returns a pointer to its own type. What it returns is a pointer to a function that returns a pointer to any type, which presumably could be the address of bar. Of course, if I were tolerant of void pointers being tossed about, I could have made things much simpler, like so:

void *bar(void);

Here, bar could return its own address or the address of any other function with the same type or any address at all. But what I wanted to express was a function type, foo_t, that returns a specifically typed pointer to its own type. And that can't be done (as far as I can figure out) because one must replace the void * in the typedef declaration with an infinite regression of cascading types. Function types, in C, cannot return themselves.

I find this interesting for two reasons. The first is that this is exactly the sort of self-referential hole that shows up seemingly everywhere when you're dealing with complex systems. Douglas Hofstadter would be amused. The second reason is that I wonder why I'd never previously tried to implement this pattern. What's unique about my current project that led me to wanting this pattern for the first time?

The most obvious answer is that this is my first real small-system embedded project. The device has no operating system, so my main function really is all there is in the device's little software universe—excluding startup code, of course. What I wanted to use the function-returning-its-own-type pattern for was for implementing different modes that the device can switch into. For example, among other modes, the device has a normal operating mode and a diagnostic mode, and so each mode naturally calls for its own sort of “main” function, like so:

void main_diagnostic_mode(void);
void main_normal_mode(void);

Each mode's main function would then be called from the actual main function whenever appropriate to do so.

However, each mode can end and cause another mode to begin. For example, a user command in normal mode may cause the diagnostic mode to begin. In all cases, which mode should begin next is determined by the logic of the current mode. This suggests the following prototypes:

mode_main_fn_t *main_diagnostic_mode(void);
mode_main_fn_t *main_normal_mode(void);

Here, mode_main_fn_t * is a pointer type that can address main_diagnostic_mode, main_normal_mode, or any of the other modes' main functions. Thus, each mode's main function runs for some indefinite duration and, upon returning, elegantly notifies the main function which mode to run next.

int main(void) {
    mode_main_fn_t *mode = main_normal_mode; /* initial mode */
    while (1) {
        mode = mode();
    }
} 

But, as I described above, this pattern cannot be expressed in the C language unless one defines mode_main_fn_t to be void, which is like cheating. So is resorting to typecasts.

There are, of course, many ways to solve this problem in general. The pattern that I wanted to implement is only one possible solution. That I went about solving the problem a different way doesn't make inexpressibility in C any less interesting.

Thursday, November 18, 2010

Givers and takers

Do you perceive yourself to be someone who gives more to society than you receive from it, or do you perceive yourself to take more from society than you give to it?

This is an interesting question to ask, both of yourself and of other people. Try it on yourself and on others around you. You'll likely find that two people similar in circumstances and demographics can and often do come up with passionately conflicting answers. And of course things become even more interesting when you ask the age-old follow-up question, “Why?”

People disagree on specific issues all the time, from the all-affecting what should we be doing about taxes? to the day-to-day how much respect and courtesy am I obliged to give the grocery store cashier? However, I propose that many of our opinions on the specific issues are heavily influenced, if not outright dictated, by whether we perceive ourselves as net-givers or net-takers (or the occasional Even Steven). When two people disagree on this self-perception, what easily can happen is that they end up talking past each other on the specific issues. Each hears what the other says, but it's as if each is speaking a foreign language. I think this pretty much sums up a large part of public political discourse in the United States: a lot of talking; even a lot of listening; and exasperation with how, well, other people can think so wrongly.

There's nothing much complicated about this. Generally speaking, self-perceived net-givers see themselves as being owed by society. After all, if you're giving more than you're taking, that usually means you're entitled to something in return. Contrary to this, self-perceived net-takers generally see themselves as owing something to society. Within this simple difference in perspective can lie the differences between two complex mazes of logic rationalizing whether taxes are too high or too low as well as whether it's okay to be a little rude to the grocery store clerk who overcharged you for those apples. Either you're owed or you owe, and from this much of one's moral world view follows.

For the record, my own gut feeling is that I'm a net-taker. I base this on the idea that my level of affluence, though semi-modest by American standards, puts me within the top one-tenth of the world's population and that, simply, I consume more than my equal share of the world's resources. (World per capita purchasing power parity GDP is around the US poverty line.) Others may counter my assessment by pointing out that I account for more than my equal share of production. I can then (1) state skepticism that the free market is a fair and just metric for value or (2) claim unfair inheritance, the idea that if my productivity is higher than the world's average then that is due mainly to having been granted a superior education and other childhood services. Some possible counters to this are (1) childhood privilege is irrelevant, (2) giving and taking are not zero-sum, and (3) the American way of life is morally superior to other ways of life. And on and on it can go, each side exposing core assumptions, which is exactly the beauty of the question. It explores the very way we perceive not just ourselves but the world.

So what do you think? Do you perceive yourself as someone who gives more to society than you receive from it, or do you perceive yourself as taking more from society than you give to it?

Monday, November 15, 2010

Go: first impressions

Starting about a week ago, I've taken it upon myself to learn a new programming language: Go. This marks the first time in about six years that I've attempted to learn another language. Now that I'm about one week into the process, I've decided to describe here some of my first impressions.

Go is intended to be something of a C replacement. This I find interesting because I am, above all else, a C programmer. I am one of those guys who codes in C not just when I'm paid to do so but when, on those rare occasions, I do something “for myself” and want to make something that lasts, something that survives endless maintenance. It's not that C is the best language for this; rather, it's just that it's the best language for me, the one I think in most readily. Thus, all of my first impressions of Go are cast from the point of view of a C guy. Pardon the pun.

My method for learning Go has, so far, entailed reading through the documentation on the official website while working on making a simulator for the Acquire board game. Acquire, for those who don't know (and you should be ashamed!), is kinda like Monopoly in that it involves real estate trading and the goal of making money. My goal with the simulator is eventually to experiment with writing AIs for the game. I figure this is a good project in that it's non-trivial but affords itself to a highly irregular schedule when it comes to personal coding time. It also serves as a vehicle for learning Go.

One of Go's “guiding principles”, according to the FAQ, is to cut down on bookkeeping—those mundane, repetitive tasks in each language that end up being done for every non-trivial program. Here's the excerpt from the FAQ.

Programming today involves too much bookkeeping, repetition, and clerical work. As Dick Gabriel says, “Old programs read like quiet conversations between a well-spoken research worker and a well-studied mechanical colleague, not as a debate with a compiler. Who'd have guessed sophistication bought such noise?” The sophistication is worthwhile—no one wants to go back to the old languages—but can it be more quietly achieved?

In C, for example, any sufficiently sophisticated program ends up requiring a lot of work in: managing header files, memory management (even when no clever tricks are being employed and everything is by rote), and defining data structures. There's no way around this; it's an emergent property of the language.

My first impression of Go is that it does indeed live up to its guiding principle by reducing bookkeeping while maintaining flexibility and expressiveness. However, this is a pretty abstract and subjective point. Here are some more succinct, concrete impressions.

  • A lot of Go's syntax is backwards. Or at least it seems backwards after being firmly ingrained with C's arbitrary way of doing it.

    C:

        int i;
        int *p, *q;
        int a[10];
        char const *s = "Hello, world.";
        char const *t = "Goodbye.";

        typedef struct {
            int a;
            float b;
            char *c;
        } T;

        int foo(int a) {
            return a + 1;
        }

    Go:

        var i int
        var p, q *int
        var a [10]int
        var s string = "Hello, world."
        t := "Goodbye."

        type T struct {
            a int
            b float
            c string
        }

        func foo(a int) int {
            return a + 1
        }
    The examples above illustrate the “backwards-ness” of Go. Of course, the order of the lexemes in, say, a declaration shouldn't affect a programmer's ability to organize his thoughts and solve problems, but after spending more than half my life in the C family of languages, such a change took some extra time for me to get used to. Like, an hour or two.

  • Go programs do indeed compile quickly, as advertised. Also, the compiler's error messages are meaningful and obvious.

  • Go doesn't have a while loop. The only looping constructs are for and goto (and the latter should never be used for looping, IMO). I'm not a fan of this because for years now I've used while to designate a “loop which does not have a set number of iterations known before the loop begins its first iteration.” Yes, I realize that while and for are basically the same thing in C, but in that language it's nice to choose the loop construct to connote further information to future maintainers.

  • Like many other modern languages, Go uses packages to modularize code. One trick I'd never seen before Go is how the language determines whether a symbol is public or private based on the capitalization of the first letter of the symbol. So, for example, foo is private to its package, and Foo is public. This, along with the “opening braces go on the same line as the if and for” rule, makes it obvious that Go is trying to enforce a particular coding style. I'm not sure how I feel about this. I see the pros and cons.

  • Go's slices make a lot of sense and seem especially practical. Part of the extra bookkeeping in C is how most arrays require some sort of length and/or capacity meta-variable(s) kept in sync with the array. Go's slices are basically like array pointers that keep track of length and capacity automatically without adding the overhead that other higher-level languages typically add to array/sequence types, like smart insertion and dynamic reallocation. I'm interested in Go as a C replacement, not as a yet-another-great-prototyping-language, like Python.

  • I'm looking forward to using Go's panic and recover. The error-handling scheme provided by these two keywords appears to be just the right amount of structure to retain the simplicity and efficiency of a goto drain in C without all the arbitrariness and bloat of full exception handling.

All in all, after spending ten or so hours working with Go, I generally find my initial negative reactions, like with the “backwards-ness” of the syntax or the semi-forced coding style, diminishing while my appreciation of Go's power and elegance increases. My overall initial impression of the language is that it holds a lot of promise for doing what C does, better.

Thursday, November 11, 2010

New World Keyboard

A while back, about seven or eight years ago, I made the switch from qwerty to Dvorak. Somewhere and somehow, through details long since forgotten, I learned about the promises of faster, more accurate typing and decided, “This is for me.” I even bought a special keyboard—this being the days of desktops, before laptops became ubiquitous, when one could switch keyboards with ease—that had both layouts' key assignments printed on each key and a hardware switch for toggling the layout in use. So began my adventures as a Dvorak typist.

A few months later, I was firmly again a full-time qwerty typist. Though I managed to learn how to type streams of text in Dvorak just fine in this short time—about as fast and accurate as I typed in qwerty—I encountered an unexpected problem in the switch. Typing did seem easier with the new, more ergonomic layout, but I discovered that I, probably like most computer users, use the keyboard for much more than merely typing text. More on this later, though, for the story continues.

Several years after my failed switch, Coworker Shafik took advantage of a rather ridiculously large quantity of downtime at work to learn touch typing. He was impressed with my own typing skills, which, though not great, are above average, and began asking me questions about how I learned typing. Often verbose when talking about my favorite subject—myself—I ended up relating, among other topics, my failed switch to Dvorak, at which point Coworker Shafik suggested, “You should relearn Dvorak.” I had that same ridiculous quantity of downtime as he and decided to give it another shot.

That was two or three years ago, and I continue to use Dvorak for most typing I do. I'm faster and more accurate with it than qwerty, though only by a little bit on both counts. Being an effective Dvorak typist puts one in the strange situation whereby one is handicapped by the default software keyboard settings on most computers and thus encounters a world made harder than need be. Overall, now that I've made a successful switch to Dvorak, I consider the Switch to be a nearly religious experience. It certainly opened my eyes to more than just typing. I'm not really sure what the best way is to go about describing the mysteries of Dvorak, so I'm going to try something new and describe it FAQ-style. I hope you find this interesting.

Q: Why Dvorak?

A: Many people are aware by now that the “traditional” qwerty keyboard was designed not for ease of use but to slow down typists and thus solve a specific hardware issue of a subset of typewriters a century ago: sticking arms. Dvorak, on the other hand, is designed to be the fastest, most efficient keyboard layout possible (for English typists), and regardless of whether it actually is the fastest, it certainly employs many of the design principles used in any keyboard layout designed for speed and comfort.

Q: What are some of these design principles that make Dvorak so great?

A: Well, it wasn't really my intention to turn this discussion into a technical one about the merits of Dvorak. Such points are already described on the Web and can be found with a quick Google search. However, I'll provide a brief summary of some of the principles. For example, because Dvorak puts all the vowels on one side of the keyboard and most English words are spelled by alternating between consonants and vowels, many words are typed on Dvorak using a rhythm of alternating use of the left hand and right hand for each key press. Try, right now, tapping your index fingers of each hand in alternating succession: left, right, left, right, left, right, etc. Try to tap as quickly as you can. Now try alternating two fingers of the same hand. You'll discover it's easier to alternate fingers of a different hand than of the same hand. Thus, the ideal keyboard layout should seek to maximize left-hand, right-hand alternation. Another example is sweeping, which is the idea that it's faster and easier to press keys from pinkie to index finger than it is from index finger to pinkie. On a qwerty keyboard, it's easier to type “asdf” than it is to type “fdsa”. Thus, the ideal keyboard layout should seek to maximize out-to-in sweeping. With the Dvorak layout, many common letter patterns are swept out-to-in, such as “th” and “sn”.

Q: But you say you're only a little faster on Dvorak than qwerty, so Dvorak can't be that great, can it?

A: Now we're getting to the interesting questions! Objectively, the fastest typists in the world don't bother with qwerty; Dvorak is much superior once one advances to the point where the bottleneck for speed and accuracy is the physical movement of the fingers. The problem, and the reason why I don't recommend learning Dvorak to others, is that few people advance to this level. I haven't, and I can type probably around 60 wpm or so.

Q: So what is the bottleneck for speed and accuracy for most people?

A: The bottleneck is the mental aspect of fast typing. Typing is a physical activity, not much different from any type of endurance racing. A reliable method to advance to a mediocre level is to put in a lot of time practicing/training at a comfortable speed but no significant effort specifically for increasing speed. To advance one's skills beyond this mediocrity, one must train specifically for speed, much as one trains to run faster: i.e., intervals and other intensity workouts. This is a lot of work and is simply not worth pursuing unless you really want to type 100 wpm or faster.

Q: But still, you say you're a little faster with the Dvorak layout than qwerty, so it seems like there's some benefit for people making the switch, even if the physical layout of the keys isn't the bottleneck for most people. What's the problem? Is the learning curve too steep?

A: Believe it or not, no. I suspect anyone can become a superior Dvorak typist after spending an hour a day for 6-8 weeks. This is a rather small commitment considering that doing so endows one with a lifelong skill.

Q: So then what's the problem?

A: Actually, to be clear, there's another benefit of using Dvorak as well, which is that the layout is more comfortable and one's fingers stay more relaxed using it because they generally go through more natural movements than with qwerty. However, on the whole, I don't recommend learning Dvorak.

Q: Why not?

A: This question really strikes into the heart of the Dvorak mindset. Look, if all one did with a computer keyboard was input streams of English text, then Dvorak would be the way to go, no question. However, much keyboard use is not this use case. I'll give you an example. Which keyboard shortcuts do you commonly use?

Q: Shortcuts? You mean like Ctrl+S for saving a file?

A: Exactly.

Q: Well, that—Ctrl+S—is definitely one of the common ones I use. So is Ctrl+O, open, for that matter.

A: How about copy (Ctrl+C), cut (Ctrl+X), and paste (Ctrl+V)? You probably use those three all the time, right?

Q: Good point. I do indeed use those three all the time.

A: And many more, surely?

Q: Yes— By the way, how is it that I'm doing the answering and you're asking the questions?

A: Ha ha, you got me there. Well, this answer required some asker participation. Okay, so as you're typing an email, you do a lot of Ctrl+C- and Ctrl+X- and Ctrl+V-pressing for manipulating the text you're entering. For example, if you decide to swap the order of a couple of sentences, you don't delete one and retype it in the other position. You cut and paste, Ctrl+X and Ctrl+V.

Q: Sure, what does this have to do with Dvorak?

A: Well, though the Ctrl key stays in the same place in the Dvorak layout—all the special keys do—the ‘C’, ‘X’, and ‘V’ keys are all swapped. You press what on a qwerty keyboard are ‘I’, ‘B’, and ‘>’.

Q: So? Though I find it strange that ‘V’ is in the ‘>’ spot, presumably you already memorized the different key positions when you spent that one hour a day, 6-8 weeks learning Dvorak. So when you want to press Ctrl+V and the ‘V’ is now in the ‘>’ spot, you will automatically make the mental adjustment and press the correct key combination.

A: No! That's the strange thing. You can learn how to type streaming text using Dvorak and become quite proficient at it, but you must relearn all the key combinations separately!

Q: No way! Really?

A: Yes, really.

Q: You're telling me that I can spend two months learning Dvorak and become faster on it than I ever was on qwerty and yet, at the first need to save a file in my favorite word processor, I'll fumble at the key combination for Ctrl+S and instead end up pressing whatever key happens to be in the ‘S’ spot on the qwerty keyboard?

A: Yes. In this case, that key combination is Ctrl+O, so you end up doing the opposite of saving: opening.

Q: That's a little hard to believe.

A: Yes, it is. I suppose one must try it for oneself to see.

Q: Which you don't recommend…?

A: Correct.

Q: Still, this seems like a pretty minor inconvenience. So you must relearn a few commonly used key combinations. How hard is that?

A: Admittedly, this is not a total deal breaker, though try to keep in mind that each application you use has many of its own key combinations, and the total number of combinations adds up to a lot. Also, some of us elect not to use your favorite word processor and instead spend most of our text-editing time in our own favorite text editor, which happens to stem from a keyboard-only esoteric little program from the 1970s whereby the user relies upon hundreds of commonly used key combinations, many of which are much more complicated than a simple control-key-plus-some-letter combination.

Q: Such as?

A: Such as: :%s/%20/ /g.

Q: What the heck is all that junk?

A: That would, in my favorite editor, search-and-replace all instances of “%20” with a space character.

Q: …?

A: Or q: to access the command history, including that search-and-replace command; k to move the cursor up a line; f0ref r. to change “%20” to “%2e” and the space character to a period; and Enter to rerun the newly modified search-and-replace command.
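For readers who don't speak that editor's dialect: the original command simply walks the whole file and swaps every “%20” for a space. Here's a rough C sketch of the same operation (the function name and caller-supplied buffer are my own invention, not anything taken from the editor itself):

```c
#include <string.h>

/* Replace every occurrence of "%20" in src with a single space,
   writing the result into dst. dst must be large enough to hold
   the result; this is an illustration, not hardened library code. */
void replace_pct20(const char *src, char *dst)
{
    while (*src) {
        if (strncmp(src, "%20", 3) == 0) {
            *dst++ = ' ';
            src += 3;
        } else {
            *dst++ = *src++;
        }
    }
    *dst = '\0';
}
```

For example, calling replace_pct20("hello%20world", buf) leaves "hello world" in buf.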

Q: Allow me to assure readers, for they cannot in fact see my eyes, that my eyes are indeed rolled back into my head and I'm about to lose consciousness…

A: I'm just trying to make a point. It's easy to relearn the 26 letters of the alphabet as well as the dozen or so punctuation keys, but it's much harder to relearn all these special key combinations.

Q: But most people don't use your little esoteric program and know only a few key combinations. It seems like they wouldn't be as handicapped from learning Dvorak as you were, and since it worked for you then it should work for them.

A: There's more to it than just the key combinations.

Q: Such as?

A: Such as the problem of retaining qwerty skills.

Q: Are you suggesting that knowing how to type on qwerty is not like riding a bicycle? (Though you did compare typing to “any type of endurance racing”.)

A: Well, you got me there. Yes, indeed, if you spend all your time typing in Dvorak, then you will become painfully slow and mistake-prone in qwerty. One must regularly type at least a little on both layouts to maintain proficiency.

Q: So learning a new keyboard layout isn't really a “lifelong skill”, then? What is learned can be lost?

A: Okay, you got me again. However, I am indeed not recommending Dvorak, despite its good qualities. If all the world used Dvorak by default, then we would be better off and in possession of happier fingers. However, we don't, and there's no escaping that you can't fully switch to Dvorak, not really. You must maintain qwerty proficiency to get around in the real world, and this requires some extra effort. That effort, I think, will not be found to be worthwhile by most persons' standards.

Q: Fair enough. So why do you find the extra effort worthwhile?

A: To tell you the truth, I'm not sure. I do happen to spend a lot of time inputting streams of text, like when I type out blog posts and long emails to friends and family. The little extra speed, accuracy, and comfort that I get from Dvorak is worth it.

Q: How much is “little”?

A: Probably a few extra wpm, a few fewer mistakes per minute, and considerably less pain in the fingers and wrists.

Q: “Less pain” sounds good.

A: Yes, that's not to be underrated. Typing on a Dvorak keyboard is nearly stress-free, for the fingers are making natural movements.

Q: Perhaps Dvorak is worth recommending to someone who just doesn't do well on a qwerty keyboard and maybe even has a lot of wrist and hand pain and has little to lose by attempting a switch?

A: That sounds like a pretty good idea. I might also make a qualified recommendation for the exceptionally curious.

Q: Does this have to do with calling the Switch a “nearly religious experience”?

A: You're exactly right. If you're proficient on qwerty and try to learn Dvorak, it's like starting all over with learning how to type. Only this time you have expectations about what the end result should be, what it's like to type well. So learning Dvorak provides a person with an opportunity to watch how their own brain adapts to a new environment, one where all the keys—except for ‘A’ and ‘M’—are switched around. It's like watching yourself from the outside.

Q: That sounds a little out there and not like my cup of tea. I suppose I understand why you don't recommend Dvorak in most cases.

A: Yes. I guess it's fair to say that I've fallen harder than most for the Dvorak Myth, which is the idea that floats around on the Internet that a mediocre qwerty typist will be transformed into an exceptional Dvorak typist, all because the keys are switched around a little bit. That's not so. Maybe I'm just following my own comfortable path of contrariness and of doing things a little differently than the others around me. Maybe I continue to be entranced by how using Dvorak in a qwerty world continues to allow me to introspect and use firsthand experience to speculate as to how our minds interface with the rest of our bodies and the world around us. Maybe Dvorak is little more than an outlet for being weird.

Q: Maybe we've explored this topic enough, and it's time to end it.

A: Yes, maybe so. Thanks for the questions.

Q: You're welcome. Thanks for the answers. Hey, aren't FAQs supposed to end with an answer and not a question?

Monday, November 8, 2010

Reason and evidence

Jack: Sbhe fpber naq frira lrnef ntb—oh, hello Edward!

Edward: Hello, Jack. What was that that you were saying?

Jack: What? Sbhe fpber naq frira lrnef ntb—I'm studying for my midterm tomorrow.

Edward: What a strange-sounding language! What is it?

Jack: ROT-13. I'm taking the class to satisfy my foreign language credit.

Edward: Oh, I see.

Jack: bhe sbersnguref

Edward: Well, I guess I should leave you be so that you can continue with your studying…

Jack: Okay, bye—wait, Edward!

Edward: Yes?

Jack: I've been meaning to tell you something.

Edward: In ROT-13? I'm sorry but I'm not at all fluent—

Jack: —No, no. It has nothing to do with ROT-13.

Edward: Avoiding one's study topic? I guess that's the point of cramming. What is it that you have to say?

Jack: What I have to say is that I think I've made important progress in figuring out how I should be living my life. As you've told me once or twice already, my previous ideas haven't always been so well thought out. But this time I'm sure I'm on the right path.

Edward: That's great to hear. What path are you on these days?

Jack: The path of reason and evidence. See, this is truly a path, a means, rather than an end in and of itself like those previous ideas of mine. Rather than jumping straight to a conclusion and being shown that the conclusion doesn't pass muster—once or twice—this time it's totally different. In my ROT-13 class—

Edward: I thought you said this had nothing to do with ROT-13.

Jack: A-ha! You got me there! But this is secondary to the main point.

Edward: Yes, yes, I was just joking. Please continue.

Jack: Sure. So, in my ROT-13 class, I met an interesting fellow who's majoring in a science of some sort, and he's taught me that what's important isn't the conclusion one reaches but the method one uses to reach conclusions.

Edward: And that method is reason and evidence?

Jack: Exactly! With reason and evidence, it's impossible to go wrong! Or at least it's really hard to go wrong. All I have to do is start with some premises based on evidence and then crank those premises through my machine of logical reasoning to conclude true statements. It's a beautiful system, and it doesn't even matter so much what the statements actually mean; as long as the premises match what is observed—evidence—then the conclusions—reason—are validated.

Edward: What if a premise turns out to be based on faulty evidence?

Jack: That's the best part! A premise can be chucked at any time and replaced with a new premise or set of premises or even nothing at all. The new resulting set of premises is re-cranked through the machine of logical reasoning to conclude new, possibly different true statements. Because the methodology is preserved, we simply take the new conclusions as the new truth and live our lives according to them.

Edward: I see. Isn't it a little bit awkward to have one's way of life disproved and changed because some new evidence turns up?

Jack: Sure, this may not be as convenient as doggedly staying the course, but such is the price one pays for living one's life according to principles that match what one actually observes in the world.

Edward: I see.

Jack: Great, you're already formulating your own premises!

Edward: Yes, um… So what have you concluded about living the good life? What true statements have you come up with so far?

Jack: Well, to answer your first question: not much. To answer your second question: not many. Remember it's the methodology that's most important. As for concluding specific statements, er, I'm still a little stuck in the premises stage. You see—

Edward: —I do—

Jack: —I haven't quite figured out the freewill issue using only evidence. But I'm sure once I figure that out all will fall neatly into place.

Edward: Hmm…

Jack: You don't think so?

Edward: Well, no—I mean, yes, er, maybe. It's just that I was thinking of a different question.

Jack: Another question? Sure, go ahead and ask. Since I haven't concluded anything yet, surely there's no way you're going to trap me this time!

Edward: Trap? Me, trap you? I only ask innocent questions!

Jack: The most dastardly traps of all are innocent questions.

Edward: Well, I suppose.

Jack: Go ahead and set your trap. Ask away!

Edward: Okay. I was thinking about this reason and evidence thing, and I was wondering: what would it be like to use reason and evidence alone?

Jack: It's just as I told you. Why don't you try it? That didn't seem like a very difficult question…

Edward: That wasn't my real question, sorry. I was just setting up my real question.

Jack: Which is…?

Edward: I'm getting to it. Okay, so I was wondering: what would it be like to use reason and evidence alone? You're saying that you use reason and evidence alone and not reason and evidence and some other technique, too, right?

Jack: Right. Using anything more than reason and evidence would be unenlightened, indeed!

Edward: Indeed. So, if I were using reason and evidence alone—and nothing else—then it seems to me, being as how I'm using reason and evidence alone to justify my actions, that I should be able to use reason and evidence alone to justify using reason and evidence alone. And I was wondering: how would I use reason and evidence alone to justify using reason and evidence alone?

Jack: Um, well, I suppose… You could… Um…

[An awkward silence passes.]

Jack: Qnzzvg, Edward!
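For anyone who skipped their ROT-13 credit: the cipher rotates each letter 13 places in the alphabet, which makes encoding and decoding the same operation. A minimal C sketch (the function is my own, not from any particular library):

```c
#include <ctype.h>

/* ROT-13: rotate each ASCII letter 13 places; everything else passes
   through unchanged. Applying the function twice returns the input. */
void rot13(char *s)
{
    for (; *s; s++) {
        if (isupper((unsigned char)*s))
            *s = 'A' + (*s - 'A' + 13) % 26;
        else if (islower((unsigned char)*s))
            *s = 'a' + (*s - 'a' + 13) % 26;
    }
}
```

Running Jack's opening line through it yields “Four score and seven years ago”.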

Thursday, November 4, 2010

Things I Believe Are True But Cannot Prove

  • There is no afterlife. When we die, that's it, we're done. Annihilated. This is the single hardest limitation for an individual to accept.

  • The idea that we forgive and accept not for others' benefit but for our own is an immensely practical one that enables us to live our lives better.

  • The socioeconomic makeup of the world is not really much different now than it has been for the last 5,000 years: about one-tenth of the world's population is privileged to consume more than their equal share of the planet's resources. The only real difference in modern times is that most of the world's privileged persons are isolated within a handful of nations rather than spread out all over the globe.

  • Most of the price of most goods and services is derived from the embodied energy in that good or service. Put another way, if some good or service, A, costs X and some other good or service, B, costs 2X, then B requires about twice as much energy to produce as does A. If you want to decrease the total amount of energy you consume, the simplest way to do this is to reduce your cost of living.

  • P != NP.

  • Discriminate use of “goto” in C is safer than the indiscriminate use of “break” and “continue” and often even the discriminate use of them.

  • The Internet doesn't pay for itself once you factor in the cost of its externalities, such as pollution and the consumption of non-renewable resources.

  • People don't choose to disagree, argue, bicker, and fight; these are compulsive actions. What we choose is how well we conduct ourselves when we do disagree, argue, bicker, and fight.

  • People are, in general, lonelier now than before the advent of cell phones and computers and other prosthetic ears and mouths.

  • That “commoners” are increasingly opinionated about party-line politics isn't good for anyone. This isn't because people are wrong in their opinions but because there is a wide range of good and noble pursuits in life, and party-line politics should be one of them for only a few people.

  • The biggest conspiracy is that no one is in control. The world is 1,001 special interests each holding the tiger by its tail. This is the scariest conspiracy of all because it means that the future is subject to no one and nothing save chaos.

  • The quality of American middle-class life has been on the decline since before I was born, but our increasing material wealth has distracted us from this trend. This is a big reason why recessions have the potential to hurt so much: when the distraction goes away, even just temporarily, it reveals the increasing poverty of our lives.
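The “goto” bullet above deserves an illustration. The discriminate use I have in mind is the familiar cleanup pattern, where a single error path releases resources in reverse order of acquisition instead of duplicating cleanup code at every exit. The function and file names below are hypothetical:

```c
#include <stdio.h>
#include <stdlib.h>

/* The "discriminate goto": one error-handling path that releases
   resources in reverse order of acquisition. Names are illustrative. */
int read_header(const char *path, char **out)
{
    int ret = -1;
    char *buf = NULL;
    FILE *f = fopen(path, "rb");
    if (!f)
        goto out;

    buf = malloc(128);
    if (!buf)
        goto out_close;

    if (fread(buf, 1, 128, f) != 128)
        goto out_free;

    *out = buf;          /* caller now owns the buffer */
    buf = NULL;          /* so the cleanup below won't free it */
    ret = 0;

out_free:
    free(buf);           /* free(NULL) is a harmless no-op */
out_close:
    fclose(f);
out:
    return ret;
}
```

Every failure, wherever it occurs, funnels through the same tail of the function; compare that with sprinkling break/continue through nested loops, where the reader must reconstruct which cleanup has and hasn't happened at each exit.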

Monday, November 1, 2010

Balance, pt. 2

This previous weekend, Laura and I each hauled a load of stuff to donate to the newly opened Goodwill store within walking distance from our apartment complex. My load was a suitcase, which I haven't used in a few years, filled with plasticware and bicycle jerseys I don't use anymore. Laura's load was a large grocery bag full of junk soon to become someone else's treasure.

Getting rid of stuff is not hard for me. Not usually. I don't like clutter, and I realized soon after college, after making that trade so many of us make by giving away time in exchange for money, that in our throwaway, replace-anything civilization it's better to err on the side of discarding something than on the side of keeping it. I think a lot of people miss learning this lesson because it's easy to overestimate the cost of obtaining and underestimate the cost of maintaining.

But getting rid of a suitcase full of junk was the easy thing I gave up last weekend. I gave up something else that, even though I've come to expect giving it up each autumn, is never easy to part with: bicycle fitness. This is a personal lesson in balance.

Despite not having any bicycle or triathlon race event planned for at least a few months, I've continued training hard these last few months and am in great bicycling form right now. Recently, I clobbered my PR for ascending South Mountain by a minute; this sort of gain is supposed to be unattainable for a non-beginner, and it marks a real high point. Laura calls this sort of talk “bragging”, but I think it's more accurate to call it a “factual statement of awesomeness”. Admittedly, though, “awesomeness” is indeed an exaggeration; I occasionally ride with guys who are awesome, and even they are far below the level necessary to become a no-name pro. In cycling, as with most sports, there's a tremendous gap between above average and elite.

That's the kind of lesson I try to keep in mind to put things into the proper perspective, because counter to the realization that I'm only one totem higher on a pole I'm still just midway up, there's a visceral joy that stems from doing well in a sport, and that joy can be blinding. These days that joy is getting in the way of some goals I've established for myself this winter, including another fitness goal of running more as well as some non-fitness ones, such as completing some simple construction projects.

The strength of bicycling is also its weakness: a tremendous amount of time can be devoted to it. My before-work rides on Tuesday and Thursday mornings are a tad over 2½ hours each. My “long” ride on Saturday morning is almost double that. This sort of schedule will force nearly anyone into good form. That's the good thing. It's also the bad thing, because once you're in good form, it's hard to give it up, even when there's no longer any reason to maintain it. To reiterate that aforementioned tosser-outer maxim: it's easy to overestimate the cost of obtaining and underestimate the cost of maintaining. This applies to fitness just as much as it applies to household junk being hauled off for donation.

If you are an elite cyclist, then you are exempt. Winter is the time that you put in base miles, which basically means long, low-effort rides that aim to prepare the mind and body for another season of hard training the next year. Though I'm not elite, I did exactly this sort of training last winter, and my firsthand observation is that it works. It sets you up to become strong. There's no substitute for a long ride and how it conditions the body to dig deep into its reserves.

But there's also no substitute for having a real life. Even here in the Valley of the Sun, where there is no true winter, the days shorten, and to one acclimated to the extreme summer heat, it gets cold enough. I especially feel the lure of winter's late sunrises, when, minus the artificial lighting in our homes and streets and everywhere else, all of nature seems to be suggesting to each of us to go to bed a little earlier, wake up a little later, and put in “base miles” in our real lives by focusing on our indoor pursuits and the people we're close to. There's no way around it: this means, for me, giving up to some degree a hard-earned level of fitness. But I remind myself that just because something is hard-earned does not mean it's worth hanging onto. I think this is part of what's entailed in striving for balance.