The name of the blog, Creative Destruction, is correct, but only partially. The definition offered at Wikipedia, drawn from Austrian economics, is a “process of industrial mutation that incessantly revolutionizes the economic structure from within, incessantly destroying the old one, incessantly creating a new one.” The term proposes an ongoing process of birth, death, and rebirth. With U.S. presidential election results a little more than a month old and the inauguration a little over a month away, we have embarked on the path of political, economic, and cultural transformation with few clear objectives other than jettisoning progressive ideology and instituting radical conservatism. It will be the reverse of the last change of administration: hope without change (Obama) vs. change without hope (Trump). Thoughtful consideration would suggest we will get only the destructive part of creative destruction and that revolution, mutation, and creative rebirth will be long delayed, if indeed they ever come at all.
December 13, 2016
August 3, 2015
Everyone knows how to play Rock, Paper, Scissors, which typically comes up as a quick means of settling some minor negotiation, with the caveat that the winner is entirely arbitrary. The notion of a Rock, Paper, Scissors tournament is therefore a non sequitur, since the winner possesses no skill, no strategic combination of throws devised to reliably defeat opponents. Rather, winners are the fortunate recipients of a blind but lucky sequence, an algorithm that produces an eventual winner yet is indifferent to the outcome. I can’t say exactly why, but I’ve been puzzling over how three-way conflicts might be decided were the categories instead Strong, Stupid, and Smart, respectively.
Rock is Strong, obviously, because it’s blunt force, whereas Paper is Stupid because it’s blank, and Scissors is Smart because it’s the only one that has any design or sophistication. For reassignments to work, however, the circle…
View original post 464 more words
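The cyclic relation the post describes — each throw beating exactly one other — takes only a few lines of Python to make concrete. The tournament function is my own illustration of the point, not anything from the post: a bracket of purely random throws still crowns a single winner (an even, power-of-two field is assumed).

```python
import random

# Cyclic dominance: each throw beats exactly one other. Per the post's
# reassignment, Rock = Strong, Paper = Stupid, Scissors = Smart.
BEATS = {"rock": "scissors", "scissors": "paper", "paper": "rock"}

def play(a, b):
    """Return the winning throw, or None on a tie."""
    if a == b:
        return None
    return a if BEATS[a] == b else b

def random_tournament(players, seed=0):
    """Single-elimination bracket in which every throw is random: the
    'blind but lucky sequence' that crowns an arbitrary champion.
    Assumes a power-of-two number of players."""
    rng = random.Random(seed)
    while len(players) > 1:
        survivors = []
        for i in range(0, len(players), 2):
            winner = None
            while winner is None:  # ties are replayed
                a = rng.choice(list(BEATS))
                b = rng.choice(list(BEATS))
                winner = play(a, b)
            # whichever player happened to hold the winning throw advances
            survivors.append(players[i] if winner == a else players[i + 1])
        players = survivors
    return players[0]
```

Run it twice with different seeds and you get different champions from identical "skill" — which is the whole joke of an RPS tournament.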
April 17, 2015
When any given technology reaches maturity, one might think that it’s time perhaps to stop innovating. A familiar, reliable example is the codex, also known as the book, now many centuries old and an obvious improvement over clay tablets and paper scrolls. Its low cost and sheer utility have yet to be surpassed. Yet damn it all if we don’t have inferior alternatives being shoved down our throats all the time, accompanied ad nauseam by the marketers’ eternal siren song: “new and improved.” Never mind that neither novelty nor improvement was needed in the slightest. A more modern example might be Microsoft Word 5.1, dating from 1992, which dinosaurs like me remember fondly for its elegance and ease of use. More than 20 years later, Microsoft Office (including MS Word) is considered by many to be bloatware, which is to say, it’s gone backwards from its early maturity.
So imagine my…
View original post 790 more words
May 1, 2014
One of the blogs I read and contribute to recently blew up over the subject of conspiracy theories. Among the arguments was the following video:
Skeptic.com purports to promote rational and scientific thinking through the use of humor, but I must admit its approach is not my cup of tea. I have seen several other videos featuring the character Mr. Deity and thought then the tone was high-handed despite the humor (more like sarcasm and ridicule). Whether I agree (or disagree) with the viewpoint presented is quite beside the point.
I wish that various conspiracies could be laid to rest finally, and maybe the authors at Skeptic.com believe they have done so, but there are significant sociological reasons why belief in conspiracy persists. Most examples I discard as not deserving a decision one way or the other, but a couple I believe because I find the evidence convincing and the official narrative unconvincing. Yeah, sometimes I feel silly subscribing to ideas others find bizarre, but then, lots of people believed (and still do) that the rush to war in Iraq was justified by disinformation provided by our own government. I didn’t need hindsight to see through that charade.
November 10, 2013
Saw a curious YouTube video, courtesy of Slipped Disc, Norman Lebrecht’s blog at Arts Journal:
I puzzled for a short while about how independent mechanical devices could sync up. The first commentator at Slipped Disc identifies the phenomenon as entrainment, which is accurate except that the comment refers to music therapy. With metronomes, however, there is no nervous system at work as with entrainment in humans. Rather, this video merely demonstrates a property of physics, also called entrainment, whereby interacting oscillating systems achieve mode lock or sync to the same period. In fact, the Wikipedia link in the previous sentence includes a CBS News report assuring everyone that it’s merely physics. This property was observed 350 years ago. Let me draw attention to the fact that the floating tray on which the metronomes sit moves sufficiently (left and right in the video) to allow the devices to interact. In truth, it took me only a little poking around to uncover the physics of it.
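The physics in question can be sketched with the Kuramoto model of coupled phase oscillators, a standard simplified stand-in for metronomes coupled through a shared moving base. To be clear, this is my illustrative sketch, not anything from the video or the linked articles, and the coupling strength, frequency spread, and step counts below are made-up values chosen to show the effect:

```python
import math
import random

def kuramoto_sync(n=8, coupling=1.5, steps=4000, dt=0.01, seed=1):
    """Simulate n phase oscillators with slightly different natural
    frequencies, each pulled toward the group's average phase (the role
    the shared tray plays for the metronomes).  Returns the order
    parameter r in [0, 1]; r near 1 means the oscillators have locked
    to a common period, like the metronomes in the video."""
    rng = random.Random(seed)
    freqs = [2 * math.pi * (1.0 + 0.02 * rng.uniform(-1, 1)) for _ in range(n)]
    phases = [rng.uniform(0, 2 * math.pi) for _ in range(n)]
    for _ in range(steps):
        new = []
        for i in range(n):
            # mean-field coupling: sine of phase differences, averaged
            pull = sum(math.sin(phases[j] - phases[i]) for j in range(n)) / n
            new.append(phases[i] + dt * (freqs[i] + coupling * pull))
        phases = new
    # order parameter: magnitude of the mean unit phasor of all phases
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)
```

With coupling well above the spread of natural frequencies, the order parameter climbs toward 1 — the mode lock the video shows; set `coupling=0` and the phases simply drift independently.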
September 27, 2013
I see from having searched for this ad that LG has a history of pranking people with the realistic quality of its HDTVs, at least when not paying close attention:
I’m unsure whether such shenanigans ought to be deplored or admired. Scaring the bejesus out of people with intrusions into the privacy of the men’s room, disappearing elevator floors, and now an apocalyptic meteor strike during a job interview seems like crossing the line. If I were subjected to such pranks, I’d be pissed.
April 17, 2013
Long time no blog posts. I’ve been fairly active at The Spiral Staircase but not at all here. However, I got hipped to an animator, Steve Cutts, whose style and content fit my thinking. Gotta share it. In a recent video, he shows humanity to be pretty hideous in the way we treat the world (ours to kill, consume, and trash at will) yet blithely ignorant about it right up to the end, when we deserve to get stomped ourselves (like the bug at the beginning):
There are other animations at his website with similar themes. The mixture of truly baleful criticism and jokey tone, with mildly distorted drawings and silly though evocative music, makes them simultaneously entertaining and hard to watch. But we have a vicarious, rubbernecking streak in us, so it’s doubtful anyone will look away to preserve their innocence (if anyone can be said to have any).
BTW, to categorize this as content-lite is undoubtedly a mischaracterization, but since the content comes from elsewhere and requires little analysis, I’ve got nothing much to add.
April 29, 2012
This video was just recently sent to me, but it appears to be at least four years old:
Pretty amazing to watch how drivers cooperate in the absence of traffic controls. The video has been sped up, so it looks a little like a Keystone Cops (var.: Keystone Kops) short, but the essence of the activity on the street is still pretty clear. Try that at an American intersection!
March 15, 2012
This image has been making the rounds:
I admit to being initially taken in by the apparent discrepancy in counting methodologies, but as with so many things, I lack the expertise to fully evaluate the accuracy of the claims. It was no surprise that someone else did, however, as can be appreciated with this YouTube video:
Of course, I never believed in the first place that the Mayan prophecy meant the end of the world. Rather, it was merely the equivalent of the odometer on the car turning over 100,000 back when there was no display for the hundred-thousands place, meaning it would reset to 00,000.0.
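The odometer analogy is simple modular arithmetic, which a couple of lines make concrete (the function and its five-digit default are my own illustration, chosen to match the example):

```python
def odometer(miles, digits=5):
    """A mechanical odometer with a fixed number of whole-mile digits
    counts modulo 10**digits: at the limit it rolls over rather than ends."""
    return miles % 10 ** digits

# At exactly 100,000 miles, a five-digit odometer reads 0 again --
# a reset, not an ending, which is the point about the Long Count.
```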
December 20, 2011
This latest just makes me laugh: a company called Burnt Impressions Inc. in Danville, Vermont, is selling a line of toasters that burn images of one’s choosing into toast.
It’s usually a T-shirt maker that jumps on a trend first, but I suppose if the trend to be coopted is spectral images appearing on grilled cheese sandwiches, then it makes perfect sense to make a toaster that does it for you. Current options include Jesus, the Virgin Mary, the Nativity, the peace sign, and the pot leaf.
I also saw a couple puns worth repeating: Cheesus Christ and Jesus Crust.
August 2, 2011
From msnbc in Europe:
An Italian parliamentary commission has approved a draft law banning women from wearing veils that cover their faces in public.
The draft passed by the constitutional affairs commission Tuesday would prohibit women from wearing a burqa, niqab or any other garb that covers the face in public.
This is making news in Rome, following the French example. Rome!? You know, the home of Vatican City? Since I can only surmise these new restrictions are religious in nature, I’ll simply say: look to thine own self for silly, religious costuming:
July 28, 2011
Been absent for a while. Nothing short and sweet to blog about until now: a 26-ft. sculpture of Marilyn Monroe’s famous pose from The Seven Year Itch.
I used to work in that building at 401 N. Michigan Ave. in Chicago. The plaza in front has been rebuilt almost continuously in the last decade and has frequently been the site of large, outdoor sculptures. I happened by there today, but the unveiling apparently took place July 15, 2011.
Far be it from me to impose my aesthetic on anyone else, but I can’t not observe how trashy this is, offering passersby the most garish upskirt photographic opportunity ever. And naturally, tacky Americans are only too happy to oblige.
April 14, 2011
March 25, 2011
This is pretty funny: an article on “How to Be a Better Listener” in the Chicago Tribune. In next week’s column, learn how to walk on two legs! But in the meantime, listen up! Here’s the set-up:
Did you know that March is International Listening Awareness Month? According to the International Listening Association (ILA), we only retain about 50 percent of what we hear immediately after we hear it, and only another 20 percent beyond that. So how can we get those percentages to rise?
I suspect the author knows nothing about cognition and makes the usual assumption that increasing those percentages means improved cognition. Well, sorry, that’s not the way perception/memory works. We discard the bulk of immediate perception to make room for new stimuli constantly flowing in. If we didn’t, the tank would overflow and nothing new would get in.
If the article were instead about focusing one’s attention, then maybe there would be something useful in it. She gives five suggestions that mostly amount to the same thing:
- Don’t take notes at meetings.
- Clear your mind.
- Absorb the feedback.
- Don’t argue, understand.
- Body language is key.
All but the last are about eliminating or reducing distractions by getting out of one’s own head and paying attention to someone else. This is good advice all the time. The last is unnecessary: body language is perceived subliminally. Conscious awareness of it is not generally necessary.
January 16, 2011
I gave a speech a bit over one year ago that cited Maslow’s hierarchy of needs (physiological, security, social, esteem, and self-actualization), though I modified it slightly to conform to needs as we now experience them. Primary attention for many of us who identify with the dominant culture has shifted to esteem needs, which include personal worth, social recognition, and accomplishment. However, those values are frequently distorted by seeking empty and vacuous fame and false social recognition. This is especially prevalent among the young, whose physiological and security needs are typically satisfied by parents. Indeed, the young have difficulty imagining scenarios where those needs aren’t met passively, which is to say effortlessly, though the recognition is dawning on many in their 20s that the living standards enjoyed by their elders are difficult to replicate.
A recent study reported on in USA Today describes the very thing I mentioned in my speech, namely, that esteem needs for today’s youth trump other concerns. They prefer praise over things like sex, alcohol, money, or even a best friend. This comes as no surprise to anyone paying attention, as evidence abounds that an entire generation of people have been encouraged to believe the world revolves around them. Similar charges have been levied on baby boomers, but as narcissism indices show, the parents got nothin’ on the kids.
January 7, 2011
From the not-really-news department comes a report of things due to be dropped from use, never to become part of the memories of people just now being born:
- travel agents
- separation of work and home
- books, magazines, and newspapers (and newspaper classifieds)
- movie rental stores
- paper maps
- wired phones
- dial-up Internet service
- forgotten friends (and forgotten anything else)
- the evening news on TV
- cameras that use film
The list goes on, but you get the idea. I doubt pretty seriously that most of these things will go the way of the dodo or that even if they do their prior existence will fail to register on those born after their disappearance. After all, technologies from yesteryear still exist in museums, in films, and in archives that we use and enjoy today. So for instance, books, CDs, and newspapers will simply disappear? Nope. Sorry. They may rise or fall in their prevalence of use, but the sheer fact that libraries collect these media will ensure they will still exist and be useful for a long time yet to come.
Yes, technologies do come and go, but the news of the death of most of these is wildly premature and imprudent. These death notices are predicated on the availability of alternatives that serve the same or slightly updated functions, but the alternatives suffer from their own lack of permanence and vulnerability to failure. GPS is more permanent than a paper map? I doubt it. Encyclopedias will dry up and blow away? Please. They’re moving online, but they still exist. Print media are moving online, too, but the printed page is still valuable and preferred in many instances.
I’m not impressed by techno-Utopian writers who opine breathlessly about the future just over the horizon and how quaint our current technologies will soon be. As recent failures of the iPhone alarm function demonstrate, it might also be premature to embrace every new technology and abandon tried-and-true old tech.
December 13, 2010
Like many American cities, Chicago has a “thing” about its parking. It’s very difficult or impossible in some neighborhoods to find an available space, and the Loop is pretty much a no-go zone unless one is willing to pay upwards of $15 per day to park. Those who can do without cars opt for iGo or Zip Cars. And then there is the whole bizarre parking meter scandal. No need to go there.
So I was curious to learn of a website advocating Chair-Free Chicago to address the longstanding tradition of protecting one’s snow-shoveled parking space with lawn chairs, two milk crates and a board, or other similar contraptions. On a purely practical level, defying someone’s marking of territory, however wrong it may be to claim public space, is too risky for most. Property damage is too costly, and finding another space is prudent.
Of course, the expectation is that one’s neighbors will behave more like criminals than neighbors. There is plenty of evidence for that. So while I can appreciate and even applaud Chair-Free Chicago for its advocacy, neighborly behaviors are a luxury that most people abandoned a long time ago, and appeals to our better natures are mostly for chumps. Ironically, if things totally melt down and civil unrest becomes widespread, forbearance and a sense of community will be the things we most need.
November 29, 2010
Not sure what possessed me, but I had a look at the General Motors website, which has a curious intro movie that says, in effect, “well, back to the drawing board.” I’m unsure whether GM’s failure to ignite the passions of the car-owning public is solely to blame for the company’s collapse and subsequent bailout. I have read at least two analyses that suggest there were structural labor and financial decisions made in the 1950s and 60s that later bore their fruit and killed GM’s profitability. The idea that GM might be able to wipe the slate clean and start again by designing cars people want to buy is the clear intent of the website, complete with shots of ultramodern architecture and designers at the drafting board, as well as a couple artisans shaping body panels.
GM was relisted on the NY Stock Exchange a couple weeks ago at a little above $34 per share, a huge increase (200+ times) over the amount it eroded to before it was delisted last year. I have no stock advice, but I do find it curious that many would want to play the “fool me once … fool me twice” game with this company.
September 22, 2010
I spotted a hot-off-the-factory-floor Honda CRZ (I don’t care about the damn hyphen!) on the street in Chicago today. I knew they went on sale last month, but this is the first I have seen. As a former CRX owner, Honda’s development of a replacement for its 80s-era econobox (which was built up to 1991) has been of keen interest to me. The presale reviews I read indicated that Honda got some things right but missed others. My desires align almost perfectly with the CRX but not so well with the CRZ:
- fuel efficient
- modestly sporty drive
- two seats but large cargo area
- mechanical reliability
By all accounts, the CRZ hits the second and fourth points but fails on the first and third. Reliability is an unknown at this point, but the complexity of hybrid designs poses some obvious risks. For a hybrid vehicle to get poorer fuel efficiency than the CRX with a traditional engine of 20 years ago is pretty telling to me. Of course, the new vehicle has to accommodate a host of new safety and comfort demands, but still ….
Walking by it on the street, I had the impression it was too large. The side-by-side comparison with the CRX reinforces that impression:
I don’t think it’s a mere trick of perspective. Maybe I’ll swing by a Honda dealer and test drive one, but in its current incarnation, I don’t think I’ll be giving up my 10-year-old Acura for a new car payment on this misfit.
September 5, 2010
The Globe & Mail reports on a new attempt to enhance traffic safety by displaying a 3D holographic image of a young girl chasing a ball in the street. The setting in the image below looks like a parking structure:
Although this initiative is reported to be temporary, it gives rise to all sorts of questions about unintended consequences, such as what happens when someone swerves to avoid the hologram and hits a pedestrian or ignores the supposed hologram and hits an actual pedestrian in the middle of the street. The question I have, which probably won’t be discussed, is why government agencies are using deceptive practices, in the name of presumed safety, to direct public behavior. Can similar deceptions be expected to trick people into paying taxes or voting for politicians, or going to war? Oh, wait. There’s no need to expect those things. We already have them.
August 6, 2010
After more than two years without a new post, I decided that from time to time I will post new entries that are more brief and less analytical than those at my personal blog, The Spiral Staircase. Feel free to comment. Regrettably, I lack administrative access to change the blog theme and with it the stupid turtle head at the top of the page. Oh, well ….
This explanation of what happens when crowds panic is too interesting to pass up. Although many clumsy thinkers tend to explain crowd effects in terms of group mind or mob mentality, the explanation provided at the linked page supports the idea that individuals become mindless parts of the crowd, which is itself also mindless. Behaviors of the mob, including the so-called wisdom of crowds, are certainly observable and in some cases even predictable, but they don’t rise to the level of a hive mind or collective consciousness. That mistaken thinking is borrowed in part from observations of the social behaviors of insects and in part from the more imaginative stories of science fiction that examine human consciousness (albeit somewhat intuitively, not knowingly) by projecting collective consciousness on alien cultures.
April 27, 2008
Although this blog has been left for dead by its group of writers, it continues to draw a number of readers. Comments are also mostly dead. However, the post below (cross-posted at my personal blog, The Spiral Staircase) may be of interest to readers who still wander in here. Comments here or there are welcome.
Creeping fascism has been a problem for some years now. Without much recourse short of armed revolt, considering how ineffectual the election process is for instigating real change, many citizens (including me) stood idly by and watched their rights and civil liberties ebb away on a daily basis as the state consolidated its control over all aspects of daily life. The precedent for today’s emerging fully operational security state (or surveillance society, as I’ve seen it called) lies in the early days of the Cold War. Having just emerged triumphant from WWII yet seeing ongoing threats on all sides, many in government began assembling a paranoid and invasive apparatus for gathering intelligence and protecting American interests. It’s almost inevitable that spending one’s life addressing external threats (and increasingly, internal ones) would warp one’s perceptions and judgment, and accordingly, it’s fair to suspect that many operatives both then and now suffer from what the French call a déformation professionnelle.
If you think this is mere hyperbole, I submit you haven’t been paying attention. A visit to the U.S. Customs and Border Protection (CBP) website quickly gives readers the sense that the country is under siege. Its mission statement reads as follows:
CBP is one of the Department of Homeland Security’s largest and most complex components, with a priority mission of keeping terrorists and their weapons out of the U.S. It also has a responsibility for securing and facilitating trade and travel while enforcing hundreds of U.S. regulations, including immigration and drug laws.
My visit to the website was for a simple customs issue, but navigating the site and perusing its content was more than a bit spooky. The front-and-center pointer to terrorists and weapons, while a legitimate concern of the agency, may not be a primary concern of the citizenry except for the agency’s Orwellian interest in keeping everyone constantly on edge. Blissfully missing was a flashing banner with the current alert level status, which is discomfiting enough when it blares over PAs at airports and transportation hubs, as though travelers had any meaningful response. (Reminds me of the air raid sirens tested on the first Wednesday of each month during my youth — rather needless in retrospect, since no one was ever really coming for us.) Indeed, the website appears to be equal parts information and public relations, with public opinion toward the agency’s mandate being heavily shaped.
More significantly, consider that many functions of state security and surveillance are now being handled by InfraGard (isn’t the misspelling of guard rather cute?), a private organization with chapters throughout the U.S. that works in conjunction with the FBI. This is from its website:
InfraGard is an information sharing and analysis effort serving the interests and combining the knowledge base of a wide range of members. At its most basic level, InfraGard is a partnership between the FBI and the private sector. InfraGard is an association of businesses, academic institutions, state and local law enforcement agencies, and other participants dedicated to sharing information and intelligence to prevent hostile acts against the United States. InfraGard Chapters are geographically linked with FBI Field Office territories. Each InfraGard Chapter has an FBI Special Agent Coordinator assigned to it, and the FBI Coordinator works closely with Supervisory Special Agent Program Managers in the Cyber Division at FBI Headquarters in Washington, D.C.
This arrangement has been criticized by The Progressive as effectively deputizing private industry to spy on people and granting business leaders unwarranted access to “an FBI secure communication network complete with VPN encrypted website, webmail, listservs, message boards, and much more.” As with privatization of many former functions of the military, this is more than a little bothersome.
But it gets worse. A book by Nick Turse titled The Complex: How the Military Invades Our Everyday Lives describes how fully the Pentagon has infiltrated and coopted everything for its purposes, which bears comparison to the movie The Matrix as a comprehensive thought control experiment brought to life. A lengthy excerpt appears in an article in TomDispatch.com with preliminary commentary, from which I quote this portion:
At one point in his farewell speech, Eisenhower presaged this point, suggesting, “The total influence — economic, political, even spiritual — [of the conjunction of the military establishment and the large arms industry] is felt in every city, every State house, every office of the Federal government.” But only Hollywood has yet managed to capture the essence of today’s omnipresent, all-encompassing, cleverly hidden system of systems that invades all our lives; this new military-industrial-technological-entertainment-academic-scientific-media-intelligence-homeland security-surveillance-national security-corporate complex that has truly taken hold of America.
And yet more bad news was delivered over the weekend, at least if you subscribe to the famous Benjamin Franklin quote: “Those who would sacrifice essential liberties for a little temporary safety deserve neither liberty nor safety.” Articles in The Washington Post and The New York Times (and elsewhere) describe how the Justice Department, rather than acting as a check on the excesses of the Executive Branch, has given support to Bush’s authoritarian interpretation of the Geneva Conventions, stating that interrogation techniques used would be judged on a sliding scale depending on the identity of the detainee and the information he or she is believed to possess. I’ve blogged before on the use of torture by our government, and despite its repugnance to most of the public, different branches of government — in defiance of international treaties — still insist upon it as a necessary tactic.
It’s difficult for me to imagine the motives behind authoritarian types for whom the modern security state would have been the wet dream of budding Cold Warriors. Are they benevolent tyrants, protecting the population for its own good, or mere profiteers, gathering riches, power, and influence to themselves? And is there some point at which the moment will crystallize into a realization by the general public that the U.S., with its gargantuan military budget and astonishing level of incarceration, has devolved into a fascist state run by a despotic oligarchy?
March 7, 2008
Cheating on a standardized test isn’t exactly unheard of, especially when competition is tough and the stakes are high. (Otherwise, who cares?) Recently, a student in Bangkok used a watch-phone capable of receiving text messages to take university entrance exams, which resulted in a variety of watches with similar capabilities being banned from test sites. The educational establishment is (so far) unwavering in its insistence that students learn and commit material to memory prior to taking exams. A chink in that armor appeared with the approval of using calculators on tests. An argument can be made, however, that open-book or open-source (as in electronics) testing should be considered for the future.
A student who has mastered and memorized a body of knowledge has indisputable advantage over another who has to search for that same information, but it’s a sign of the times that fewer students, and more importantly, fewer businesses, believe that it’s worthwhile to possess information except in the rarefied instance of test taking. In the real world, looking something up and problem solving on the fly is challenging the notion that acquired knowledge and skill give people better (read: more efficient) job performance.
I’ve yet to see any substantial evidence that the Google effect or the Wiki effect — the outsourcing of memory, in short — has significantly diminished the value of traversing a large body of knowledge to be prepared for adult life, be it at the level of a high school diploma or an advanced university degree. However, it’s clear that the communications age and its technologies have placed at our fingertips amazing information resources that many of us consult daily. Ironically, that has inadvertently cheapened the value of expertise in many walks of life, as most anyone with a few functioning brain cells can easily acquire the information to handle most of life’s tasks and quite a few job requirements. Our attitudes toward what constitutes cheating have been similarly degraded as the obvious utility of all types of workarounds sweeps aside ethical considerations. It remains to be seen whether educators, who themselves are known to indulge in cheats of one sort or another, can uphold the value of learning the traditional way. If it were left to business, we’d all be cheating.
February 23, 2008
February 19, 2008
This report at Hitwise compares the ages and incomes of users of Yahoo! to those of Google. The differences appear to be negligible or at least subtly shaded, unless I’m reading the data wrong. However, in my circle, the perception has long been that Yahoo! was pretty well eclipsed by Google. That perception is apparently untrue, which accounts in part for Microsoft’s interest in acquiring Yahoo!, which hadn’t made sense to me before.
Best of all, though, are the labels assigned to various demographics in this graph:
What does one do if more than one marketing niche might be a reasonable fit? Here they are in a list:
Rural Villages and Farms
What’s the difference between Varying Lifestyles and American Diversity, or Small-Town Contentment and Remote America? And don’t all of these demographics have a normal age range?
I’m used to being reduced to a number based on my consumption and spending habits. Undoubtedly, one (or more) of those demographics fits me, despite my knee-jerk disdain for such things. Without a specialization in marketing, though, I still can’t believe that such information is worthwhile to makers of products and providers of services. Maybe someone else knows better.
February 16, 2008
MSN Money has a brief article, by now rather old (two years is definitely old — shoot, one month is old by journalistic standards), examining the economics of eating out vs. preparing home-cooked meals. Based on a fatuous calculation of the “costs” associated with preparing food at home (time spent shopping and preparing plus the food itself), the article finds that it’s cheaper to eat out.
This is a bizarre conclusion. The reduction of the complex of activities involved in eating to a business calculation is myopic in the extreme. In fairness, other considerations are given some attention, too: the health of restaurant food, the size of portions, and the obesity epidemic in the U.S. But the bottom line for this article appears to be getting oneself fed — as though the most efficient ways of getting that done both monetarily and in time are the best ways of calculating value — in order to return to productive activity, that is, making more money. I suppose eating while working is the most economically efficient way of getting fed by the logic of this article. Or perhaps just forgo eating at all.
It’s worth remembering from time to time that it’s the journey, not the destination, that’s important. Although not all meals can or should be extraordinary culinary experiences, the entirety of shopping, preparation, consumption, and clean-up offers an enjoyable process that isn’t well suited to economic analysis by efficiency experts. Notably absent from the article, for example, is the value of sharing a meal in all its aspects. That’s why people host dinner parties. If it were merely about strapping on the feed bag, there are certainly less taxing ways of filling one’s stomach.
February 3, 2008
The movement of middle class whites from city centers to suburbs in the 1950s and beyond is one of the many effects cars and their infrastructure have wrought on social organization and landscape. Those of us born in the baby boom and after (most of us at this point in history) have a difficult time imagining any other possible way of living besides climbing in the car every day and driving wherever we go. A handful of U.S. cities have significant enough public transportation to enable some to forgo owning a car, and I know a few die-hards who try to ride bicycles everywhere, even in the winter.
Jim Kunstler has a phrase he repeats from time to time for the acute blindness most of us share regarding the inevitable changes to the economics of owning and operating automobiles: happy motoring. We act as though the era before cars — the one we can’t remember or fully imagine — is permanently behind us and the availability of cheap energy, whether gasoline, ethanol, or electricity, will never disappear. Peak oil experts tell a different story, and because of that, Kunstler has prophesied that the suburb is already dead but we don’t yet realize it. All of that remains to be seen, of course. What’s clear right now at least is that we’ve put all of our eggs in this one particular basket, and until the basket is irrevocably ruined, we’ll continue to act like there will be no end to happy motoring.
In the meantime, a couple curious behaviors related to car culture have caught my attention. In Chicago, we get a couple heavy snows each winter that pretty much grind traffic to a halt. Many people park on the street, and when they dig their cars out, all sorts of things appear on the street to claim the cleared spot: lawn chairs, broken furniture, orange hazard cones, milk crates and boards, etc. The unspoken contract seems to be “I cleared this spot, now you respect my labor and don’t park here.” It can’t possibly be legal to stake out a parking place, and it only happens in the winter after a snow, but it seems pretty clear that one would have to be pretty foolish to remove the lawn furniture, park in the spot, and then leave one’s vehicle worth several thousands (at the least) unattended and vulnerable to whatever vandalism the person(s) who cleared the snow might inflict.
Personally, I would never stake out a spot, though I’ve been disappointed a few times to lose one I cleared, and if I did stake one out, I’d never go the extra step and vandalize the car of someone who moved my lawn furniture out of the way to park. Do I expect others to exercise that restraint? Not on your life. I’m undecided whether this tradition is basically harmless or an instance of hoarding in scarcity. Since I have a dedicated parking spot, I guess I don’t have to decide.
The other behavior having partly to do with car culture is the line of vehicles on the shoulder of the highway into O’Hare International Airport. It’s obvious, I think, that folks are waiting in their vehicles 1-2 miles away from the airport for a phone call from the person they’re picking up rather than circling the terminal or parking and walking to meet their party. It seems like a reasonable approach until one considers that these cars are waiting on the shoulder alongside a highway where people routinely travel 60-80 mph. Blocking the shoulder may not be much of a problem, but merging into traffic from a dead stop is not a maneuver I trust most people to execute either respectfully or safely.
I don’t follow the local media closely enough to know whether police are ticketing drivers waiting along the highway or whether City Hall has declared a moratorium on claiming parking spots after snow removal. Perhaps these behaviors pose no particular issue for most. Of course, I’m wondering what will happen when the price of oil spikes and few can afford to rack up 25k+ miles per year. If it’s anything like the horribly stupid movie Blood Car, it won’t be pretty.
February 2, 2008
There is a curious and growing sense that the 2008 presidential race (and the leadership of the free world that follows therefrom) is the Democrats’ to lose, and considering that the two dominant candidates are a woman on one hand and a black man on the other, the U.S. electorate is in a unique position to make history in either eventual result: we will elect a woman or a black man as president — the first in U.S. history — and establish a new political era. Obviously (or maybe not so), this is a distraction from the real issues of American politics, but that putatively history-making event has nonetheless helped reduce our political self-determination to the pointless and ephemeral question of electability over governance. As a result, and in a very real sense, we deserve what we get.
Super Tuesday approaches (a catchy if not stupid and reductionist characterization), and yet many of us participate blindly in this awful charade that our votes will have some meaningful impact on the outcome: the selection of a candidate for one party or the other. On the Democratic side (I’m unfamiliar with the Republican side), I’ve been chagrined to learn that delegates and candidates both have agreed to set aside a number of states and refuse to campaign and/or award delegates. I’m too much a novice in electoral politics to understand why, for instance, Michigan and Florida shouldn’t matter, so I remain politically naive and ineffectual. Perhaps someone more expert in the nuances of running a campaign within the vagaries of party politics can explain it to me. Failing that, I recognize my participation in the process as a meaningless drop in a flow that has been prefigured by forces with much more to gain or lose than can possibly be left to the whims of the electorate.
So we will make history of a sort. Big deal. I feel confident that none of the “electable” candidates present a prospect for meaningful change. My cynicism runs so deep that no incremental change or adoption of new window dressing is worth more than a moment’s contemplation. The purposeful candidates — those who propose real, substantive change from politics as usual, which is to say, the politics bought and paid for by the highest paying private interests — have already been winnowed from the contest.
But I empathize still with the winning candidate, Democrat or Republican. He or she will inherit such an awful mess — militarily, economically, and culturally — that no brief period of recovery and prosperity is possible to contemplate. We’ve dug for ourselves as Americans a sizable hole from which to extricate ourselves, and it may take generations (or more) to restore even a few of the advantages we have thus far taken for granted and now squandered.
January 28, 2008
Yes all political junkies dream of the brokered convention. It would be exciting!! But I started to think about how the news media would deal with such a thing if it were necessary. The primaries are early. The convention is in August. Between the primaries and the convention the bobblehead discussion would be unbearable. I don’t know how the campaigns themselves would deal with it. They couldn’t go dark, but they couldn’t campaign as the presumptive nominee either. There’d be calls and pressures from various quarters for one of the candidates to “do the honorable thing” and bow out for the sake of the party, or Tim Russert’s Nantucket vacation, or whatever.
Aside from the last part about Tim Russert’s vacation plans, which is obvious snark, this is a highly substantive statement from Teh Atrios. How will all three of the substantive candidates currently in the Democratic side of the race remain until a brokered August convention can sort things out? Can fundraising from the left maintain three mostly-idle campaigns at the national level at the same time while we wait to see what will happen in Denver? And if they do maintain that level of life-support, will any of them be able to start that mad, pell-mell sprint for November 4th at the sound of the cannon? All of these are important questions that we must ask ourselves, the party as a whole.
(And all of them are very good questions that could just as easily apply to the GOP side of the bracket this year, judging by my own personal (and probably amateurish) pre-primary analysis of both Florida and Terminal Tuesday when neither McCain nor Romney can pull far enough ahead to keep the other down, much less force Huckabee out of the race. That’s just a prediction, and will not factor further into this post.)
Yet his next post, not ninety minutes later, puts a completely different spin on this line of thinking. And not one to the benefit of Teh Atrios, either.
The existence of multiple candidates in the Democratic primary race means that the party is hopelessly splintered.
As a moderate in this party, I read this as saying the following:
Shut up. Because you’re not picking my candidate, you’re sinking us all. Take MY hand, Luke!
Suddenly I am reminded of what was happening in the Connecticut Senate primary in 2006 between Joe Lieberman and Ned Lamont, when the party really WAS hopelessly splintered. An incumbent Senator lost the primary, yet remained in the race and eventually recaptured his seat. So the question is: why did the party splinter in Connecticut?
Of course, the answer could never be that out-of-state activists like Duncan Black himself, as joined by Jane Hamsher, Markos Moulitsas, and their attendant casts of thousands simply loathed and despised Joe Lieberman and everything he did and said. The answer could never be that they would attempt anything in their power to influence the election of a Senator not in their state. The answer could never be that, without their constant and unwavering support, Ned Lamont would not have defeated Lieberman in the primaries in the first place. The answer could never be that they themselves designed the blueprint for the hopeless splintering of our party when they scribed a bright dividing line, between the moderate wing and the progressive wing, that none shall pass without suffering near-permanent damage to their political careers.
And now I see Duncan Black himself sitting there, bemoaning the fact that the party is “hopelessly splintered”. (Insert prima donna-ish back-of-hand-to-forehead Oh Whatever Shall We Do! pose here.) And I hear this suggestion in the back of my head, one that he wants the rest of us to simply ignore our own decisions and throw ourselves behind the Clinton44 campaign, which he supports with all his heart and body and soul. And all of this simply so that we present a united front in the fall.
Pardon me whilst I call shenanigans here. I’d call something stronger, but all the cow pastures in Wisconsin wouldn’t hold enough of it to add up to the sheer amount of what I’d really prefer to call.
I have seen the dangers of letting the loudest sections of a political party have their way while ignoring the rest. With the GOP, it gave us the rise of religious conservatism. With the Democrats, it is giving us the rise of progressive liberalism. With both, those whose politics are in the middle are effectively disenfranchised and removed from the political process. And from both sides comes great damage to this country’s political structure.
My response is simple. Do not allow anyone, regardless of who or why or where or when or how, tell you who should or must or need receive your vote. Your vote is yours, and yours alone, to cast for whomsoever and whatsoever you so freely decide. No one is allowed to take that away from you. You should not allow them to even passively take it from you, such as by following the advice of a divisive pundit like Duncan Black by voting their way at their own fervent insistence.
If you want to vote for Hillary, then please do so. If you want to vote for Barack, then please do so. If you want to vote for John, then please do so. If you want to vote for Mike Gravel, then please do so. But let it be because you so desire and not because some bobblehead, whether the televised or the virtual variations of the species, told you to vote for Candidate X rather than Candidate Y.
For when you allow someone to choose your vote for you, you allow yourself to fall victim to the most dangerous form of disenfranchisement around: the passive surrender of your vote to a third party.
The concept that an individual’s personal choice is what truly matters is the philosophical heart of a democracy. Without it, a Democrat might as well be a Republican.
January 23, 2008
The BBC News has an article reporting that scientists have found evidence to suggest that human evolution is “speeding up.” Scare quotes are used for “speeding up” in the title of the article for good reason: it’s a reckless remark that can’t be proffered with a straight face. The study on which the article is based
looked specifically at genetic variations called “single nucleotide polymorphisms,” or SNPs. These are single-point mutations, or changes, in the genetic sequence of DNA on chromosomes.
If the mutation is advantageous then it will spread rapidly in the population, along with DNA on either side of the mutation.
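The quoted claim that an advantageous mutation “will spread rapidly in the population” can be illustrated with a toy model. The sketch below is my own illustration, not anything from the study: a standard textbook haploid selection recursion in which a variant with selection coefficient s changes frequency each generation as p′ = p(1 + s) / (1 + ps). The specific numbers (a 1% advantage, a starting frequency of 0.1%) are arbitrary assumptions chosen for illustration.

```python
# Toy haploid selection model (illustrative only; not drawn from the study).
# An advantageous allele at frequency p with selection coefficient s moves to
# p' = p * (1 + s) / (1 + p * s) in the next generation.

def next_freq(p, s):
    """Frequency of the advantageous allele after one generation of selection."""
    return p * (1 + s) / (1 + p * s)

def generations_to_reach(p0, s, target):
    """Count generations until the allele frequency first exceeds `target`."""
    p, gens = p0, 0
    while p < target:
        p = next_freq(p, s)
        gens += 1
    return gens

# Even a modest 1% advantage carries a rare variant (0.1% frequency) to
# majority status in on the order of several hundred generations -- quick
# by evolutionary standards, glacial by human ones.
print(generations_to_reach(0.001, 0.01, 0.5))
```

The point of the exercise is the timescale: “rapid” spread in this sense still plays out over many thousands of years of human generations, which bears on the article’s confusion about what “speeding up” could mean.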
It’s unclear to me whether it’s fair to conclude that evidence of a few changes in genetic sequence is tantamount to evolutionary change on the order of species change, which the article never states. Is there a term that describes minor genetic changes without meaningful change in the species? Put another way, isn’t a wide range of genetic variation within the species pretty normal without being evolutionary?
Researchers found evidence of recent selection in 7% of all human genes, including lighter skin and blue eyes in northern Europe and partial resistance to diseases, such as malaria, among some African populations.
This makes me wonder if the usual four mechanisms influencing evolution — natural selection, mutation, random genetic drift, and gene flow — shouldn’t be amended to include cultural selection in the case of culturally preferred attributes such as skin type and eye color. (Nope, no suggestion of cultural bias or racial preference there. Move along.)
Also, if I’m not mistaken, when human evolution is discussed by regular folks without specialized training in genetics, the usual context is science fiction and the mode of evolution is either cultural (evolved minds) or biological (evolved bodies) or both. These are wildly divergent from a more narrowly defined science of genetic evolution, which apparently considers even modest change or variation evolutionary.
Without providing suitable context for the science and disclaiming the obvious associations with science fiction, the article invites credulous readers to infer that we’re pointed toward an evolutionary breakthrough of some sort. What else could “speed up” suggest? The article muddies the waters further with these poorly framed quotes from Steve Jones, a genetics professor at University College London:
“The general picture that evolution has speeded up in the last 10,000 years as we change from, to put it bluntly, being animals to being humans is clearly true,” he explained. “To suggest it is happening at this instant, I would suggest, is probably wrong.”
“At the moment we are in an evolutionary interval. We are in between two storms. One storm has more or less blown itself out, the storm of farming.”
I won’t bother to comment on the idiotic suggestion that humans aren’t animals. The more immediate problem is timescale. In evolutionary time, 10,000 years is almost nothing. Whether you believe in gradualism or punctuated equilibrium or some blend of both, it typically takes tens of thousands of years to observe changes to the genotype that aren’t merely chromosomal variations. Evolution is happening now, this instant; it’s always happening. But it isn’t instantaneous. Neither is a sunrise. Disclaiming such a thing is absurd to even a novice.
Perhaps it’s worthwhile to remind gentle readers not to get science news from the popular press. Whereas the study may have uncovered something meaningful to a geneticist, it holds almost no value to the general public the way it is reported and veers dangerously toward suggesting things from the realm of science fiction. Science is very good at discovering how things work. It’s not so good at predicting things or even extrapolating trends more than one step beyond the evidence. Take the “suggestion” of human evolution “speeding up” with a sizable grain of salt.