The state publishes a PDF containing the 2012 California Dept of Transportation Bicycle-Related Roadway Standards. It contains, across several pages, all of the current standards for bicycle-related signage (and much more). I've collected the signs together here for easier web reference (fabric patterns, anyone?).
Note that the vague legacy "Share the Road" sign seems to be fading out, to be replaced by the much clearer "MAY USE FULL LANE." About time!
...up to a point.
A couple of years back the late, great Hitch appeared on FOX and made some rather pointed comments concerning Mitt Romney and the painful history of the Mormon/LDS church on the subject of race (in the video below up to about 1 minute in -- I'd appreciate a better link if one exists, without the extra passage from Richard Dawkins).
...which can seem pretty damaging, especially when paired with quotes like this one from Brigham Young himself in 1855: "You must not think, from what I say, that I am opposed to slavery. No! The negro is damned, and is to serve his master till God chooses to remove the curse of Ham..."
These issues are not a new political weapon -- they were a rather large factor in the political fizzling of the non-Romneyesque Mormon presidential candidate Mo Udall, who was narrowly passed over for the Democratic nomination in favor of Jimmy Carter. Shortly after that election, a new revelation was received by the church that re-instated black people into the full range of the faith (closer to the original teachings of Joseph Smith, who had ordained blacks himself).
Yet Hitchens provocatively reminds us: Romney was already a grown man and a major figure in the then-officially-racist church by that time. Shouldn't Romney still be held accountable? A few journalists took up Hitchens's challenge, as you can see here on Meet the Press:
I'm willing to give Romney the benefit of any doubt and say that his response is 100% genuine. That his father supported Dr. King is easy to verify (though he never 'marched' with King); that the change in church dogma moved him to tears of thanks, and that Mitt (and Udall) opposed this church policy, doesn't surprise me -- frankly, it's about the most human moment for Romney on television that I can recall.
But does that mean Romney is out of the ethical woods? Sorry, no.
Besides not being a racist, it means: Romney has a moral sense that is not guided by church dogma or senior church authorities. Most people do -- it's why everyday American Christians (and allied faithful) don't, say, stone adulterers at Club Med or burn witches hanging out at Burning Man.
And there's the rub. If Romney says his choices are guided by faith, and not open to discussion, then... shouldn't Romney himself be held to the same criteria? Because he, and his father, and the Udalls, and many many other politicians, are clearly answering an internal moral voice that's not guided by the scriptures or elders of their churches.
And here's where Hitchens shows up again on an end-run from 2007.
A black candidate with ties to Louis Farrakhan could expect questions about his faith in the existence of the mad scientist Yakub, creator of the white race, or in the orbiting mother ship visited by the head of the Nation of Islam. What gives Romney an exemption?
The answer should be: nothing. I'm appalled at recent statements by his political opponents that the sources of Romney's views (and actions, should he manage to be elected) would be off-limits for discussion. You should be, too -- especially when other devout believers of the same stripe have such strongly-opposing views, like the laudable Senator Tom Udall, nephew of Mo, who is determined to overturn the Citizens United ruling.
WWJD? Can a corporation be ordained, or not?
It strikes me that designers themselves provide ample evidence that "Intelligent Design" is a fundamentally flawed idea.
"I.D." has at its core the same premise as Paley's "watchmaker thesis" as presented in his 1802 Natural Theology:
In crossing a heath, suppose I pitched my foot against a stone, and were asked how the stone came to be there; I might possibly answer, that, for anything I knew to the contrary, it had lain there forever: nor would it perhaps be very easy to show the absurdity of this answer. But suppose I had found a watch upon the ground, and it should be inquired how the watch happened to be in that place; I should hardly think of the answer I had before given, that for anything I knew, the watch might have always been there. (...) There must have existed, at some time, and at some place or other, an artificer or artificers, who formed [the watch] for the purpose which we find it actually to answer; who comprehended its construction, and designed its use. (...) Every indication of contrivance, every manifestation of design, which existed in the watch, exists in the works of nature; with the difference, on the side of nature, of being greater or more, and that in a degree which exceeds all computation.
An obvious question might be: was the watch left there deliberately? Already, the notion of finding the watch on the heath implies not a deliberate chain of cause and effect, but an accidental one. Even if it was left there with a specific intent, was that intent that Paley find it? Or by coming along with his mind keen on deciding what things mean before looking at them, has Paley already mucked things up for the watch's actual owner?
But my point here isn't to shoot down Paley's easily-pierced balloon specifically. Rather, it's to point out that while engineering and technology have been incredibly influential, it seems as if that influence has almost never really gone according to design.
Twenty years before Paley's book, the Montgolfier brothers launched the first in a long line of successful flying inventions. Spurred on by ideas about how to assault the seemingly-impregnable fortress of Gibraltar, their work went on a path that's easily traced from their sheep, duck, and rooster flights through the first human flight (with a military officer on board, the Marquis François Laurent d'Arlandes), to the daily aerial firebombing of Asian and European cities a century and a half later. One could say, in fact, that everything went quite according to horrible plan.
Or... did it? More than two centuries later, Gibraltar remains under a British governor.
The watch found on Paley's imaginary walk -- did it perhaps incorporate that contemporary technological marvel, the lever escapement within an Abraham-Louis Bréguet-designed tourbillon? While impressive, and enough to merit Bréguet a commission from the French Navy (which used his clocks extensively in navigation), there's just one problem -- it doesn't work.
Oh yes, a fine watch will tell the time. But the tourbillon, designed to counter the effects of shifting gravity (perfect for a naval vessel)? Oops! Modern horologists say not only that it's unnecessary in modern timepieces (except to enhance sales), but that it probably never was -- watches that lacked a tourbillon were never less accurate than those that had one.
And as for a "master" designer? What potential confluences are there between aviation and watches (much less the grand clockwork of the universe), except maybe in the coincidental affinity of Kim Jong-il for Rolexes, even as his government's rockets put pressure on the doomsday clock?
At this point everyone in America, and certainly most people in English-speaking countries, are aware of last fall's election success of Proposition 8, by which a majority of California voters banned non-heterosexual marriage in the state. At its core it opens the curtain not only on the single issue trumpeted on the TV networks (gay marriage), but on several, the toughest being those relating to what constitutes a "right" and what status "rights" have against that other favorite human institution: God.
This week the California Supreme Court got an earful of testimony regarding the specific bureaucratic issues relating to the constitutionality of Prop 8. A quick check of the Wikipedia entry seems more concerned with celebrities and scandal than core issues. In fact, it's decidedly hard, from the Wikipedia article, to actually find the brief text of the "Protection Act":
- SECTION 1. Title
- This measure shall be known and may be cited as the "California Marriage Protection Act."
- SECTION 2. Article I.
- Section 7.5 is added to the California Constitution, to read:
Sec. 7.5. Only marriage between a man and a woman is valid or recognized in California.
Anti-Prop 8 attorney Therese Stewart puts forward this important question: Can a simple majority use the ballot initiative to take fundamental rights away from historically disfavored minorities? In the past, the California Supreme Court has already defined LGBT people as a protected minority:
...When the court ruled that LGBT people constitute a protected minority, it put anti-gay provisions on the same constitutional footing as provisions that single out people of color, women, or religious minorities; and when it ruled that relegating same-sex couples to domestic partnership instead of civil marriage violated their fundamental right of dignity, the court recognized that the humanity and the equality of LGBT people are inextricably linked.
I wonder if the (largely) religiously-driven campaigns in support of Prop 8 are creating a dangerous precedent for themselves. Consider the standard U.S. government wording relating to the description of groups who are protected from illegal discrimination, here grabbed from Title VII: "race, color, religion, sex, or national origin."
To my mind, one of these things is not like the others. While race, color, sex, and national origin are innate properties of an individual, properties which were not chosen by that individual and not open to revision, religion is not. There is very clearly no objective case for religion as part of biology or an otherwise innate property of a person. Any Christian can declare themselves Muslim, any Muslim declare themselves a Scientologist, any Scientologist declare themselves tired of the exercise and become a comfortably non-practicing agnostic. It is a choice, a matter of opinion and behavior -- not a fundamental property of the person.
Crucial to the core of religious intolerance toward homosexuality is the assertion that homosexuality is a sin -- and of course the very nature of sin is that humans make choices which are contrary to the rules set by God (or so the theory goes). If homosexuality is behavior, and behaviors are not protected as rights, then religious advocates are setting themselves up by saying that there is some more basic principle that divides rights from privileges.
Worse yet, for the faithful, research increasingly indicates that homosexuality is simply part of an individual's biology, an innate part of their person. If homosexuality is part of the description of gender, then Title VII's protection (along with that of other statutes) relating to "sex" kicks in, and with it a host of legal challenges can be made, backed by healthy precedents on gender rights issues, to ensure the equal civil rights of LGBT people. Such implicit rights would likewise cut against the notion, already proposed by some Christian evangelicals, that "pre-natal anti-gay treatment" should be used (leaving aside the probably more difficult question of what sort of prenatal treatment they imagine would work). After all, if gender is protected, then doctors would have no right to alter or suppress it, given that the same fundamentalist demographic also asserts that embryos are persons with rights. Or would they be willing to forgo that assertion for the sake of their homophobia?
Of course, one can rarely find a rational, objective basis in fields like government and religion. Religion in particular operates explicitly on the principle that it is not subject to rational discourse or to any requirement for objectivity (which includes, of course, all questions about biological bases and similar medical or scientific "obfuscations"). Which is fine -- as opinions are -- as long as it stays away from attempts to set these unfounded (err, carefully designed to be unfoundable!) ideas up as the (legal) basis of objective behavior and restrictions to be imposed on everyone else.
This is in itself a paradox: if God is all-powerful, then why do humans need to be enforcing God's opinion? If homosexuality is a sin, don't they believe God will sort it out and punish (or reward) the right people at the right time? Or is being trapped in a world full of irrational paranoia punishment enough?
In the film Repo Man, Tracey Walter's character opines: "The more you drive, the less intelligent you are," a line I've glibly repeated ever since hearing it.
That line has always felt true, and the central kernel of it is this: intelligence doesn't really enter into it. No matter how intelligent, no matter how fit, no matter how wealthy, no matter how experienced, no matter how good you could be, you simply won't be. Michael Schumacher has essentially no advantage whatsoever in commuting when compared, say, to a semi-paralysed 87-year-old illiterate who forgot to bring her glasses. A $400K Mercedes has no operational advantage over a rattling secondhand Kia in 95% of real traffic. They will all arrive at the same time: late.
Short of hiring a driver (or a helicopter pilot), there's little to be done about it. At least one can try to use the time that highway commuting wastes, as I do with podcasted lectures and the like, but you need to do it at the expense of both reduced safety (attention distracted by learning) and reduced learning (attention distracted by a non-signaling white pickup suddenly veering left through the three lanes in front of you). No wonder idiots like Howard Stern, Rush Limbaugh, and their ilk can dominate radio: their listeners are strapped and buckled into their chairs, and intelligent consideration is a freeway hazard. It's a motorized version of the Ludovico technique.
Year after year Moore's Law accelerates the information superhighway by a sizable margin; year after year we have quicker computers and related circuits to drive our businesses and media, yet every year the nearly-straight slice of Highway 101 between the offices of Intel and Oracle gets slower, more choked, more dangerous. It's astounding to think how many people willingly burn an hour or two of every day having to deal with mortal terror -- not in an abstract way, but dealing with traffic jams and seeing the ambulances on a regular basis.
It's a perpetual puzzle to me. Cars kill us on the street, they choke us, isolate us, they cloak the planet in hot carbon, drown our cities, and yet we still obsess over them. News reports on global warming run side-by-side with "most popular story" links on "top cars of 2009." The rack of car magazines at Borders is even larger than that for fashion rags.
Repo Man was right -- and worse, the more each of us drives, the less intelligent we all are, collectively. In simple economic terms this is a gigantic drag on the social and financial state of the world. Why, exactly, are so many people willing to toss themselves into debt for the latest hulking SUV, and governments & industry so timid about even suggesting alternatives?
...as if on cue, I received a letter from Schmap saying that they are planning to use one of my photographs in their (allegedly non-commercial but obviously ad-revenue-driven) travel guides.
Not particularly surprising to me, the photo was not marked with a Creative Commons tag, but rather as "© All Rights Reserved." Maybe they don't realize that I habitually send CDs of thumbnails to the Copyright Office to actually register my pix?
As a followup to the earlier post on skepticism about "Creative Commons," it's been sadly amusing to watch the recent flaps declaring flickr (a) as censors but (b) not censorious enough. What seems common to both situations is a failure of common sense, a failure rendered raw with typically abrasive flourish by EPUK's "Sqweegee" in his article on the Schmap smokeup:
Flickr is a mashup of hobbyists who merely want to share snaps of kittens and sunsets and rather a lot of more serious photographers who covertly dream of dumping the day job and becoming pros someday. For now, all are content to share for free, but the expectation is that enough exposure and recognition should eventually lead to fees, fame and stardom if you are good enough.
This is of course romantic rubbish: there really are no clear demarcation lines between pros and amateurs anymore except an insistence on being paid that is being rendered untenable by oversupply. "Pro" means "makes a living." Every aspirant pro who gives away their work "for exposure" undercuts their own future by demonstrating to clients that they need not pay for work they consider good enough to use. So they never will.
Now, we all know that Creative Commons licenses are not meant to cause harm -- or so their proponents remind us. They are meant as a shining pathway to an ideal Republic of pure creativity and form, based on freedom and love, where everyone with a laptop and a wifi connection can be their very own personalized Philosopher King. And get rich.
No, the concept can't be flawed, it makes perfect sense to create copyrights where there's no control over the copying rights. Instead, there must be some conspiracy, some very bad people who have been using it without having their hearts pre-aligned according to our approved rules. So no, let's not look at the fundamentally pig-dumb notions of Creative Commons: let's make a right turn and find a new set of different problems that won't make us look bad.
Is the potential for corruption in government and enterprise a deep one, and one that causes plenty of real problems for people every day? Yes. And it's one that I too care deeply about. But... what? This will get fixed by the wiki/CC crowd? It's their very ignorance (or deliberate glossing-over) of the ability of people to be guided by their own self-interest (particularly in this case, the self-interest of people who realize that it's cheaper to steal images and ideas than it is to create them) that has made Creative Commons such a social disaster. And now the same people claim that they're out to somehow fix the general problem of corruption?
Just give them ten years before requiring any further statements. Yeah, that's the idea. That should land all the CC culprits right in tenured retirement without any further pesky demands for a "pre-baked" "revolution."
As the Schmap folks wrote in response to outrage at their broad appropriation of images and further use of them as a means for advertising Schmap's products right back at and through the hoodwinked flickrites: "We'll do our best to stay the right side of the line throughout all this."
Which line is that? I suspect the bottom one.
A couple of days back I was part of an informal web conversation about "Creative Commons" copyrights (spurred by a publisher who had grabbed CC-marked images off the web and republished them for a profit without clear notification to the owner). A predictable pro-CC argument came up: that somehow using a CC notice, rather than the traditionally-restrictive "All Rights Reserved," would encourage the publication of images from artists who might otherwise never get a venue (The takeaway: one should be honored when their otherwise-unknown creations have been found worth stealing?).
Similarly, last week I finally got to listen to the entire "Google Print Panel" session at the Commonwealth Club of California, in which it was suggested that copyright-free publishing was crucial to get "those little books that every publisher had rejected but that just might become the Next Big Thing" out to the public.
They both seem like the same argument, both using a straw man. Maybe someone will create some amazing but unrecognized work, and maybe open publishing would help that one person, but does this speculation really justify the wholesale dismantling of the entire copyright portfolio for all other authors? Maybe it's a convenient, publisher-coddling fiction. If anything, it seems to me that with the ease of modern web publishing, those independent and small creators are actually the ones who need a full suite of copyright protection, not the mega-corporations.
CC and "open publishing," from my view, offer creators nothing other than reduced paychecks. They are not about "enabling" creativity, but rather about de-valuing it for the sake of publishers (including web-portal publishers like Google/Yahoo/MS, who as far as I can tell have largely co-opted the notion of "web sharing" by creating behemoth databases of "contributed" data which they in turn own, an ownership which is the source of their great wealth) -- publishers who continue to charge the same amounts for their products containing images and other works that they haven't paid for, benefiting from the protections of intellectual property laws that they've convinced their suppliers to abandon.
It's just economies of scale, we like to tell ourselves. On my last rushed morning in Boston I set off to seek out a few extra gifts for my return to the bay, something unique and memorable. Instead, as I orbited in increasing circles through downtown, it was clear that I was less in Massachusetts and more in the United States of Generica, with every storefront filled with familiar brands and stocked with goods in no way different from those in Santana Row or The Great Mall. In the end I settled for some team shirts, sold every fifty feet throughout the mall under the Prudential tower and all supplied by... a centralized web site in Washington State.
Okay, the big guys succeed on costs and inevitably the little guys end up either losing on price or having to carve out a space based on tradition, design, or quality... right?
Then I got home and found this Metro story waiting un-read on my kitchen table: how Stanford Coffee Roasters, a small, popular, successful and long-established business in Stanford's tony mall, was pushed out not, apparently, for any economic reasons, but because the mall owners preferred homogeneity.
"I was told that they preferred to rent to a Triple A tenant," she recalls. "I said, What is a Triple A tenant? They told me it was a business that would have a national chain and national marketing."
In other words, the mall managers preferred Starbucks over the successful Stanford (in Stanford) not for the rent values but simply because they felt that national chains were inherently superior to anything that might have a local flavor (literally or figuratively) -- so superior that it was worth crowbarring out a local landmark for the sake of enforcing dullness.
Note to self: never shop at Stanford Mall again.
"When people are free to do as they please, they usually imitate each other." -- Eric Hoffer
This is one of those never-quite-finished entries that's been long-lingering due to lack of time and attention. In this case it's been months (there are some that are older... what can I say?): the original save date was early 2005, and it's lingered in "draft mode" ever since.
I was digging around on del.icio.us one afternoon and saw that after a lot of web traffic and game-industry furor back in November 2004, ea_spouse was still drawing hits from across the del.icio.us spectrum.
Now, I know a lot of folks at EA. I deal with them and know that a lot of them are happy, that they keep moving on from one project to the next, from group to group, production to production, and they're doing fine. If the general picture were really as bleak as ea_spouse paints it, then I doubt very much that anyone would work there for more than a few months. And that's not the case: plenty of people at EA have been there, happy, for years.
Just the same, there can be problems in the industry. I regularly see people griping on message boards, usually about pay, hours, and credits. I read the recent article on IGDA purporting to be lessons for EA managers (and game managers in general), & I felt that the authors had gotten a few things wrong.
I don't think this is really an EA issue at its core; maybe "ea_spouse" hails from there, but EA is just a big, easy target for journalists. Rather, it's an industry-wide issue. There are many other companies, better and worse alike...
The games industry, spawned from the general computer graphics and animation industries, is by its nature dynamic and changeable. More than that, at least in its earlier days, it was mutable. It was really possible for a very small number of people -- sometimes just one dedicated and brilliant person -- to create things that would radically change the industry or create whole new genres. When Carmack & co made the Doom engine, that small group changed everything. When Blinn simplified the Torrance-Sparrow illumination model, it changed everything. Toy Story? Photoshop? GeForce 256? Jurassic Park? Every one a work created by some small group of smart dynamos that blew open the prevailing notions about what was possible.
(Folks like Clayton Christensen & Tom Peters might argue that it still is mutable -- we just don't see how yet.)
When you're hot on the trail of a Big Idea, time is immaterial. You do it out of love for craft, love of what you're doing, love of the idea that you're out to change the world. It's not a job, it's What You Do; it's a part of who you are and who you're becoming, what you're discovering and dreaming. That's true either at the office or out of it, depending on the person: for the people spending 50 hours a week after work carefully airbrushing the side of their van, for someone carefully tweaking their website scripts every other day, or for someone sure that they can get the rig on that character just so, because it's never been done that way before.
The beauty of Big Ideas is that no one complains about them: those hundred-hour weeks spent on finding a way to get cool reflections aren't a waste, they're a part of life's flow. The nasty part is, sometimes those Big Ideas make money. Sometimes they make a lot of money. And then the trouble starts.
The business-statistics sources IGDA quotes aren't inherently bad; they're good sources. A lot of the same sources can be found in books like Ed Yourdon's Death March or even systems books like Dietrich Dörner's The Logic of Failure. Probably my favorite of all such resources is on the web: The Software Program Manager's Network is a great source on project management, risk assessment and management (and most of it has already been paid for by our US tax dollars, so its resources are free to all).
The problem is that such sources don't see the surrounding business environment as highly dynamic; rather, they see business as stable, as a broad zero-sum or low-growth background. If your business is expanding at a rate of 50% per year and bad management is costing you 12% in productivity per year, well, guess what? You made money anyway. And what's more, human nature being human nature, there's more than a slight chance that those bad managers will be convinced that the reason they made money was their own sage and masterful leadership.
In a slower, more-stable business environment with 5% or 10% growth (or less), that 12% drop will get a manager fired. In a boom market, it won't even be noticed. It may seem counter-intuitive, but successful innovation is an excellent breeding ground for bad managers. Good managers can exist there too -- a good manager in a boom market will do better than a bad one -- but the bad one has a good chance of survival, even encouragement, simply from the dumb luck of being in the right place at the right time. Anywhere else, they'd be dead.
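The arithmetic behind this is easy to sketch (a toy model, using the illustrative percentages from the text rather than real industry data):

```python
# Toy model: does a 12% management drag get noticed?
# Net result = (1 + market growth) * (1 - productivity lost to bad management) - 1

def net_growth(market_growth: float, management_drag: float) -> float:
    """Year-over-year net growth after management losses."""
    return (1 + market_growth) * (1 - management_drag) - 1

boom = net_growth(0.50, 0.12)   # booming niche: 50% growth, 12% drag
slow = net_growth(0.05, 0.12)   # stable business: 5% growth, same drag

print(f"boom market: {boom:+.1%}")   # still strongly positive (+32.0%)
print(f"slow market: {slow:+.1%}")   # now shrinking (-7.6%): the manager gets fired
```

Same 12% drag in both cases; only the background growth rate decides whether it reads as "sage leadership" or as a firing offense.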
Second, honey and money attract flies. When you look at the growth of many industries and niches, you'll see that once money appears, the companies will swell not only with the core creators and producers of products, but with layers of bureaucrats who really had little or no knowledge of the XYZ business before they realized that Forbes had identified it as growing. I've heard animation producers say they saw no difference between making films and making cardboard boxes. They were only there for the money, and had no interest in anything like Core Values or purpose.
So now imagine the stage is set: the innovation of XYZ, based on a Big Idea that its creators were pursuing out of the love of doing it, has suddenly produced great results, and the money is flowing in. The structural layers of the companies are swelling with people who weren't really part of the XYZ innovation in the first place. The XYZ business is slow and project-oriented: you work for years and you either win big or you lose big (large projects are common in many innovation businesses: cell phones, gaming, biotech...). The people who lose big go away, while the people who win big -- well, it must have been the great accounting and production liaison lunches that made everything so successful, right? Because the cycle of the entire enterprise is such that, by the time a game or film is released and the money is being counted to determine whether the project was successful or a failure, the creators are no longer part of the process. Bean-counting, yes. But animation? Design? Modelling? Research? That was done months ago, maybe years. Those people are working on a completely different project, potentially at some completely different company (back in November 2004 it was claimed EA had something like 50% turnover annually). In the meantime, the bean counters are patting each other's backs for work well done.
(Case in point: some years ago I was working on a TV commercial and needed to check some client paperwork. I couldn't find my producer or his assistant. After a while I discovered that they were at an afternoon party, because another commercial I'd done some months earlier (modeled, textured, animated, lit, rendered, and delivered) had won an animation award. Despite my name on the award, the agency and producers were busy having a party without even bothering to tell me; eventually one of the agency guys thought it would be fun to come back to the production office and visit, which was the moment at which I discovered that the award even existed and had been won: when someone I'd never seen or heard of before came by to say thanks for all the great work "we'd done together." Two years later, I won the same annual award, and found out about it because a colleague had attended the ceremony the previous evening, where they saw an exec I'd never heard of accept the award in my name; even though I was working only a few blocks away, no one had bothered to tell me. I would attribute these sorts of stories (and I have plenty) to individual people, rather than a broad pattern, if it weren't for the fact that they happened with completely separate companies at opposite ends of the country.)
So there is a confluence of forces, between two groups with very different expectations:
(To be continued when I'm less cranky, heh)
Measured in "blog years" perhaps I haven't posted for a while, but it's good to keep it in perspective. Consider the timeline above, for instance, which describes part of our relationship to foods.
Each horizontal pixel in the timeline represents 162.5 years, and I've only run it back as far as the advent of modern humans: people who are physically the same as you or I. The timeline could have gone further -- for example, to the beginnings of fire and cooking, which vary in estimation from 500,000 to 1.7 million years ago, anywhere from five to fifteen times as long as the current chart (or 2125 years per pixel, which would place the advent of agriculture a mere four pixels from the end of the timeline).
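The pixel arithmetic is easy to check (a quick sketch; the ~10,000-year round figure for the advent of agriculture is my assumption, not a number from the chart itself):

```python
# Timeline scale check, using the years-per-pixel figures from the post.
YEARS_PER_PIXEL = 162.5      # scale of the chart as published
STRETCHED_SCALE = 2125.0     # scale if the chart reached back to fire and cooking
AGRICULTURE_AGE = 10_000     # assumed round date for the advent of agriculture

# How much more time does each pixel cover on the stretched chart?
scale_factor = STRETCHED_SCALE / YEARS_PER_PIXEL
print(f"stretched chart covers {scale_factor:.0f}x more time per pixel")  # ~13x

# At the stretched scale, agriculture sits only a few pixels from the end:
pixels_from_end = AGRICULTURE_AGE / STRETCHED_SCALE
print(f"agriculture: {pixels_from_end:.1f} pixels from the right edge")   # ~4.7
```

Which is where the "mere four pixels" figure comes from: nearly everything we call civilization fits in the last sliver of the chart.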
As you can see, the "classic staples" of our diets (breads, cultivated fruits, domesticated animals, imported foods and spices) didn't really show up until fairly recently, in that last portion of the timeline. Our bodies (brains included), which have been evolving towards their own maximized ecological niche for a very long time, were developed and balanced for a well-defined and pretty consistent diet long before agriculture cropped up.
Almost everything we're familiar with today, eating-wise, fits into that very last pixel on the right. The past 162.5 years have seen the adoption of canned foods, refrigeration, aluminum foil, mechanized harvesting, supermarkets, plastic wrap, artificial flavors and preservatives, and (this one is key) corporate production and marketing of most everything we put in our mouths.
If we set aside questions of how to maximize convenience, or how to best enrich the coffers of corporate food producers, what should we eat? The bodies that we have are still designed to make their best use of the foods and environment in which they evolved. Primitive man may have had a life expectancy of 30 years, but when you remove the effects of predation (so far, none of my immediate relatives have been eaten by leopards) and severe infant mortality, the differences begin to dissolve. The human body was born eating simple foods, eating sweets only when in season, not being exposed to concocted flavors or near-infinite supplies of processed and addictive grains. The fossil record itself shows a shift when grains (which are relatively high in silicon and other minerals from the earth they've grown in) were added to the human diet: the teeth of the fossils at that point in prehistory suddenly become worn, ground away.
Not all modern foods need to be harmful, of course. But it's good to be aware of what they are, and to always be aware that the interests of your health are only marginally connected to the (financial) interests of those who feed you.
For myself, I'm sticking to foods not far from early man's tree, at least 95% of the time anyway (exceptions mostly arise in social circumstances, such as always accepting hospitality; another core human trait is the ability to share!). The result is pretty much a copy of modern low-carb fare: meat, eggs, veggies, only a small number of sweet foods, avoidance of refined flours and sugars. I don't need to run from predators, but I give my elliptical machine a regular spin and inflict painful crunches on my abs. I take some modern advice by using cholesterol-free egg products, and keep just a few lingering industrialized vices: coffee, wine, and Diet Coke.
The net results after the past year or two of this eating philosophy: my doctor says I'm healthy, I feel great and energized, and I'm back to the clothing sizes I enjoyed in my youth; quite literally, I've recently found my best source of bluejeans is Gap Kids. Thank you, Alley Oop.
I got the new National Geographic yesterday and the lead article was simply banner-titled: "Love." The text was about the neurochemistry of attraction and attachment, with connecting photos by Jodi Cobb. I have to admit that I was a bit puzzled on (a) what made this a NatGeo story (yeah, yeah, an easy one: to sell magazines. But why NatGeo instead of Cosmo then? Where's the market differentiation?), and (b) why they sent Cobb out on this one, since the resultant photos of affection are, well, charming but hard to point to as particularly specific to the topics in the article, and why would you need to fly around the world to get them? Anyway, to the Real Meat of this entry:
The article didn't cover any particularly new territory (it is a popular magazine), but it had some useful references on something I've been suspecting for a few years now (also not news for people who know more about neurology than I do): a potentially-harmful connection between selective serotonin re-uptake inhibitor antidepressants like Paxil and the ability to create, maintain, and enjoy attachments.
Attraction is known to be connected to the neurotransmitter serotonin: in some ill patients, and in people feeling in love, serotonin levels are low. SSRI medications like Paxil and Prozac are designed to increase serotonin levels; the patient is less likely to feel depressed, but also loses interest in sexual attraction and attachment.
These meds are among the most commonly prescribed today. Millions of people are on them.
Digging further I found these presentations on MedScape which describe a whole medical symposium held on this during 2005, titled "Sex, Sexuality, and Serotonin."
Helen Fisher's opening presentation at that link is a good reference because it lays out in detail a lot of medical understanding about the mechanisms of attraction and attachment, before getting to her concerns about anti-depressants (the other presentations are good too).
Now, if someone is very ill and needs to, say, avoid suicide because they are deeply depressed, well by all means they should be helped. But these side-effects expose a somewhat undefined element in our current notions about "health care": namely, that they focus almost entirely on the individual "patient," without concern for, or even much awareness of, the potential interaction effects between the patient and associated non-patients.
If a depressed patient is depressed because she's beaten at home, will Paxil solve her problem? Or if a patient has BPD or a related personality disorder, should their insurance cover the psychological harm they will inevitably inflict on the people in their families and workplaces? These are difficult questions that are largely left unasked and thus unresolved (perhaps because good answers to them may not benefit insurance companies).
Killing the ability to experience attraction suppresses not only a lot of the things we as humans consider "fun" but ultimately also the actions we consider at the core of life's highest purposes: attachments, commitments, feelings of connection between one another and the universe. You can consider these aspects of a higher calling or just a natural expression of the presence of the appropriate amounts of dopamine; either way they are key to our well-being as human beings, our ability and sense of achievement, and the well-being of the other human beings around us.
It verifies my suspicion that if people in most any relationship were just more willing to generate a little oxytocin, even if they didn't feel like it, they'd feel better and their relationships would be stronger. But that starts to sound more like bar talk :)
I've had a few folks send me some tangentially-related Dilbert cartoons about a "never been cubicled" photographer...
A couple of days ago I was having a coffee-shop stop and spied a travel book. At that moment, I realized something important:
If the letter "F" were prepended to Ireland, then it would be Fireland. Which would be great, because they would be the closest country to Iceland. Then, they could have a war!
It would be so cool. Err, hot.
One of my favorite work-related blogs, one which I read very deliberately, is Garr Reynolds's Presentation Zen. In a recent entry, Garr writes about the recent Nobel Prize award lecture given by Harold Pinter.
Like most people casually interested in the Nobels, I sat down and read the transcript of Pinter's lecture, which uses the context of his plays as a bridge through to the principles of political theatre and then into a biting and at times terrifying polemic against modern US and British foreign policies.
At one point he nominates himself as a speechwriter for President Bush, and while reading one can almost hear Bush's voice in Pinter's parody:
God is good. God is great. God is good. My God is good. Bin Laden's God is bad. His is a bad God. Saddam's God was bad, except he didn't have one. He was a barbarian. We are not barbarians. We don't chop people's heads off. We believe in freedom. So does God. I am not a barbarian. I am the democratically elected leader of a freedom-loving democracy. We are a compassionate society. We give compassionate electrocution and compassionate lethal injection. We are a great nation. I am not a dictator. He is. I am not a barbarian. He is. And he is. They all are. I possess moral authority. You see this fist? This is my moral authority. And don't you forget it.
Pinter's lecture on literature covers much political ground, and little of it seems much directed at literature; but the economy and directness of his language reveal the greatness of his authorship, and he even goes so far as to include a poem by Pablo Neruda and to all but close his lecture with a poem of his own.
That same evening, only two nights ago, a journalist friend & I went to see George Clooney's Good Night, and Good Luck, a dramatization of Edward R. Murrow's most famous hour: his broadcast against the Red Scare, "A Report on Senator Joseph R. McCarthy." In that program, Murrow sagely used McCarthy's own words, brought to history by the indelible recording of film and tape. And this is how Murrow closes the program:
Earlier, the Senator asked, "Upon what meat does this, our Caesar, feed?" Had he looked three lines earlier in Shakespeare's Caesar, he would have found this line, which is not altogether inappropriate: "The fault, dear Brutus, is not in our stars, but in ourselves."
What TV journalist today, I asked my friend, would be so fluid with language and eloquent in their reporting as Murrow was? What modern TV journalist would not shy away from using a source like Shakespeare, fearing that its presence would marginalize their words as "overintellectual" and "elitist"? And yet, Murrow's language (spare, direct, well-chosen words) was exactly the sort of language that was best able to be the sharp pin for McCarthy's balloon.
McCarthy's own later appearance on TV, using the same sorts of innuendo and ad hominem vitriol that had previously given him so much credibility in the minds of an anxious public, now rendered ineffectual, was ultimately the true sign of his defeat.
The buzz on Presentation Zen was not about Pinter's words, or the subjects of his attacks (well, not entirely). Rather, it centered on the interest shared with the blog: presentation itself. So I called up the video of Pinter's presentation, which I had been assured was even more effective than the text of the speech.
I had expected this. Pinter's plays often leave so much to the actor, to the reader: the words themselves spare, the performances breathtaking. And I was keen to hear his delivery of Neruda's poem, and of his own. Poetry, great poetry, is always best aloud, with a speaker to breathe life into it. So I clicked "play" and spent the next 46 minutes held captive, wanting to remove the headphones but not daring to.
I had already read the script. I knew every word that Pinter would say. And yet his delivery, carefully crafted by an actor and a master playwright, presented through video editing in as spare a manner as any minimalist play, was as riveting a performance as any I am ever likely to see.
A common theme on Presentation Zen in recent months has been the idea of presenting naked: without the bells and whistles and complicated visual noise. Here is Pinter, enthusiastically raw with his message, with the Nobel in his grasp, swinging it like a hammer at the things he finds most horrible and difficult in the world.
Murrow was no Harold Pinter. And likewise Pinter is no Murrow. Murrow ended his telecast on McCarthy with a monologue that is both accusing and yet hopeful, a call for good people to arise:
This is no time for men who oppose Senator McCarthy's methods to keep silent, or for those who approve. We can deny our heritage and our history, but we cannot escape responsibility for the result. There is no way for a citizen of a republic to abdicate his responsibilities. As a nation we have come into our full inheritance at a tender age. We proclaim ourselves, as indeed we are, the defenders of freedom, wherever it continues to exist in the world, but we cannot defend freedom abroad by deserting it at home.
And Pinter too makes such a call, but obliquely, and by the time he reaches it we have already been bludgeoned by the force of the rest of his words.
I believe that despite the enormous odds which exist, unflinching, unswerving, fierce intellectual determination, as citizens, to define the real truth of our lives and our societies is a crucial obligation which devolves upon us all. It is in fact mandatory.
If such a determination is not embodied in our political vision we have no hope of restoring what is so nearly lost to us: the dignity of man.
Pinter's statement of hope seems melancholy, while Murrow's is full of force. Yet I cannot believe that Pinter would be blind to this. And his call for "fierce intellectual determination" is not made of casually-chosen words.
Listening to Pinter's Bush parody, I was reminded of another speech I had heard, back in the days of Ronald Reagan (another target of Pinter's pointed words). It was a speech given not by a proper politician, but a fictional one.
In the seventeenth chapter of St. Luke, it is written that the kingdom of God is within man, not one man nor a group of men, but in all men! In you! You, the people, have the power, the power to create machines, the power to create happiness! You, the people, have the power to make this life free and beautiful, to make this life a wonderful adventure. Then in the name of democracy, let us use that power. Let us all unite. Let us fight for a new world, a decent world that will give men a chance to work, that will give youth a future and old age a security. By the promise of these things, brutes have risen to power. But they lie! They do not fulfill that promise. They never will! Dictators free themselves but they enslave the people. Now let us fight to fulfill that promise. Let us fight to free the world! To do away with national barriers! To do away with greed, with hate and intolerance! Let us fight for a world of reason, a world where science and progress will lead to all men's happiness. Soldiers, in the name of democracy, let us all unite!
This is part of the closing monologue given by Charlie Chaplin as the faux-Hitler in The Great Dictator. In the scene, Chaplin's little tailor, masquerading as der Fuhrer, hopes to bring a broad sea change to the shape of his fascist country through the power of such promises. And yet... even then, many years ago, I felt that with some minor tweaks here and there, such a feel-good speech could have been just as easily given by Hitler himself, rallying his adherents to a kinder, gentler, world where everyone was a brother because everyone had been hammered into the same narrow shape.
And one other speech also came to mind, one that was alluded to some time ago here on botzilla. At the time I was discussing the Melian debate, which I felt so closely matched the events then unfolding at the UN. And Pinter's parody reminded me of the famous oration at The Funeral of Pericles, a speech delivered to a crowd who knew full well of their leaders' actions on Melos:
Our form of government does not enter into rivalry with the institutions of others. Our government does not copy our neighbors', but is an example to them. It is true that we are called a democracy, for the administration is in the hands of the many and not of the few. But while there exists equal justice to all and alike in their private disputes, the claim of excellence is also recognized; and when a citizen is in any way distinguished, he is preferred to the public service, not as a matter of privilege, but as the reward of merit. Neither is poverty an obstacle, but a man may benefit his country whatever the obscurity of his condition. There is no exclusiveness in our public life, and in our private business we are not suspicious of one another, nor angry with our neighbor if he does what he likes; we do not put on sour looks at him which, though harmless, are not pleasant. While we are thus unconstrained in our private business, a spirit of reverence pervades our public acts; we are prevented from doing wrong by respect for the authorities and for the laws, having a particular regard to those which are ordained for the protection of the injured as well as those unwritten laws which bring upon the transgressor of them the reprobation of the general sentiment.
The Athens Thucydides describes demanded tribute from its neighbors and murdered those who did not pay it; moreover, fully a third of the Athenian population were slaves, many dragged from previous freedom in those same military conflicts.
George Bush did not invent political duplicity, nor did Hitler create the deceptive feel-good-about-your-country dodge. No, it is an old dance, and one whose rhythms must surely be reflected in some basic part of our human character and biology, for it has been a part of us for a very, very long time.
Had I only listened to Pinter, and had these same connections linking-up in my mind, I would be in a sorry state indeed. His words are wilting. And the history of our species has many faults we'd like to forget. And yet they are not the full story either.
Harold Pinter was not the only Nobel Prize recipient this year; there are other prizes, such as the Peace Prize given to Mohamed ElBaradei and, important to me today, the joint prize in Economics, given to two prominent game theorists: Robert Aumann and Thomas Schelling. Their presentations too are preserved on video and available on the web. And they too stray from what a casual observer might consider the core topic of their awards.
Alfred Nobel started his prizes to help the world, a world he felt he had harmed by the same business that brought his great monetary fortune: the creation of dynamite and the many weapons that came from it. So perhaps it is not so surprising that just as Pinter strayed quickly from purely literate concerns to those of life and responsibility and dignity on our planet, so too do these economists: Schelling's address covers his theories regarding international gamesmanship and the remarkable fact that over the past sixty years, no nuclear weapon has ever been used in anger, since that week long ago in Hiroshima and Nagasaki. And Aumann's lecture roots at the basic causes for this, revealing curiously counter-intuitive arguments for peace and cooperation through the use of ready but unused deadly force. He dedicated his talk to the memory of the pacifist Leo Tolstoy.
Aumann and Schelling have reduced conflict to an analysis of interactive choices. They are game theorists, and have pushed past such classic game-theory paradoxes as the prisoners' dilemma to examine what they call the theory of repeated games. Rather than looking only at strategies for "how can I win this game," they considered the best strategies for winning a game repeatedly. In games where there are potential winners and losers, they found that often the best strategy for long-term success lay not in domination (and the requisite subjugation of the loser), but in cooperation: unless the initial payoff of domination was so great as to effectively end the game, domination could not be sustained as long as the loser had the ability to retaliate and punish.
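The idea above can be sketched in a few lines of Python using the textbook iterated prisoner's dilemma. The payoff numbers and the "tit for tat" retaliation strategy are standard illustrative choices of mine, not taken from Aumann's or Schelling's lectures.

```python
# Payoffs for one round, keyed by (my move, their move):
# 'C' = cooperate, 'D' = defect (attempt to dominate).
PAYOFF = {
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def always_defect(my_history, their_history):
    """Pure domination: defect every round."""
    return 'D'

def tit_for_tat(my_history, their_history):
    """Cooperate first, then mirror the opponent's last move:
    the 'ability to retaliate and punish' described above."""
    return their_history[-1] if their_history else 'C'

def play(strategy_a, strategy_b, rounds=100):
    """Play the game repeatedly; return total scores (a, b)."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strategy_a(hist_a, hist_b)
        b = strategy_b(hist_b, hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(always_defect, tit_for_tat, rounds=1))  # (5, 0): domination wins once
print(play(always_defect, tit_for_tat))            # (104, 99): punished thereafter
print(play(tit_for_tat, tit_for_tat))              # (300, 300): sustained cooperation
```

The defector grabs the big one-time payoff, but over a hundred rounds the mutually cooperative pair ends up far ahead of both the defector and a pair of mutual defectors (who score only 100 each), which is the sense in which domination "could not be sustained."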
In the extreme case, we have games such as nuclear MAD, where the punishment is absolute and final. In lesser cases we have governments, companies, and individuals who come to recognize that their best long-term goals are the ones in which they all participate in both the game and the rewards.
Pinter's words imply a knowledge of what is right and what is wrong on the part of the listener. They appeal to the desirability of truth without having to explain why truth is valuable. Murrow likewise addresses an audience whose values, surpassing issues like momentary political gain, place truth and fairness and broad equality at their cores.
Aumann and Schelling, building upon the work of prior Nobel laureate game theorists like John Nash (who's had a movie biography of his own), pull back the underpinnings of those values to reveal just how powerful they can be. Repeated game theory describes and accounts not only for punishment and revenge, but more importantly for cooperation, for charity, and for altruism. In Aumann's lecture, he declares two key points:
That using private information reveals it; and that time reveals all truths. Repeated games are ultimately about the very issue Pinter addresses: truth. Eventually the players strike a mutual equilibrium, a cooperative outcome in which dominance cannot be sustained.
Aumann's analysis surely comes from a warm-hearted man, but it is an analysis built of cold solid bricks, with evidence and ramifications that are social, biological, economic, rational, mathematical. As surely as Einstein's equations laid out principles that can describe both the a-bomb and television, so do these game-theory principles lay down the case for hope.
Yes, duplicity and the lust for power have always been with us. But over repetitions, things do improve. Slavery in Athens has been gone for a very long time. Hitler's gas chambers may not be the end of state brutality and democide, but such horrors would be hard to hide in today's world. There are no weapons of mass destruction, and the world has come to know it. Truth will prevail.
Two steps forward, one step back. Two steps forward.
Truth (its revelation, propagation, and accountability) is thus both the mechanism for our potential salvation and part of its reward.
(It has been a while since I've written an entry under this category. Maybe it's been building up! Written in one sitting; forgive the awkward sentences. (link repairs, 20 jan))