Monday, January 31, 2011

Samuel Loyd and the 14-15 Puzzle

Samuel Loyd
January 31st, 1841 to April 10th, 1911


Samuel Loyd was an American puzzlemaker best known for composing chess problems and for mathematically based games and puzzles; he is also credited with games such as Parcheesi. He studied engineering and intended to become a steam and mechanical engineer, but he soon made his living from his puzzles and chess problems. Loyd's most famous puzzle was the 14-15 Puzzle, which he produced in 1878. The craze swept America, and employers put up notices prohibiting play during office hours. Loyd's 15 puzzle is the familiar 4×4 arrangement of 15 square numbered tiles in a tray that must be reordered by sliding one tile at a time into the vacant space.
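The hook of the 14-15 Puzzle was that Loyd offered a $1,000 prize to anyone who could restore order from a board with only tiles 14 and 15 swapped. The money was never at risk: each slide is a transposition, and a simple parity invariant rules that position out. Here is a minimal sketch of the standard solvability test (the board encoding and function name are mine, not Loyd's):

```python
# Parity test for the 4x4 fifteen puzzle. A board is a tuple of 16 numbers
# read row by row, with 0 marking the gap. Standard result: a position is
# solvable iff the inversion count of the tiles plus the gap's row counted
# from the bottom (1 = bottom row) is odd.

def solvable(board):
    tiles = [t for t in board if t != 0]                 # ignore the gap
    inversions = sum(1 for i in range(len(tiles))
                     for j in range(i + 1, len(tiles))
                     if tiles[i] > tiles[j])
    gap_row_from_bottom = 4 - board.index(0) // 4
    return (inversions + gap_row_from_bottom) % 2 == 1

solved = tuple(range(1, 16)) + (0,)          # 1..15 in order, gap last
loyds  = tuple(range(1, 14)) + (15, 14, 0)   # Loyd's 14-15 swap

print(solvable(solved))   # True
print(solvable(loyds))    # False -- the prize money was safe
```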

The puzzle

Fifteen puzzle [Wikipedia]

This is a favorite of mine...



Sliding Puzzles: Dad's Puzzle

Foreign film distribution


This is a sad commentary on the status of foreign films shown in the United States, but it is a cold fact that if you are not a big-draw name [like Bergman or Kurosawa] it is a simple matter of economics. It costs big bucks to distribute a foreign film, especially one that is marginal in content and director status.

"A Golden Age of Foreign Films, Mostly Unseen"

by

A. O. Scott

January 26th, 2011

The New York Times

ONE of the few surprises at the Golden Globes two weeks ago — you’ll be forgiven if you’ve already forgotten about that odd little broadcast — was the award given to “Carlos,” the French director Olivier Assayas’s five-hour-plus reconstruction of the life and career of the notorious terrorist of the 1970s and ’80s Carlos the Jackal. The award represented a high point of cosmopolitanism at a predictably parochial event: 11 languages spoken on screen; dozens of locations across Europe and the Middle East; a polyglot cast led by a Venezuelan star, Édgar Ramírez, who has the potential to become an international sex symbol. What more could you want from a foreign film?

Except that “Carlos” was not nominated for the Golden Globe in that category (the winner was “In a Better World,” from Denmark): it was made for, and first shown on, French television, a fact that also rendered it ineligible for consideration — as a foreign-language or any other kind of film — by the Academy of Motion Picture Arts and Sciences, which announced its nominees last Tuesday. Mr. Assayas’s dark-horse victory at the Globes was for best miniseries or motion picture made for television. Fair enough, given its origins. Then again, “Carlos” has encountered its American audience in the way more and more foreign films do these days: on a handful of movie screens in big cities, and on cable and video-on-demand. So its exclusion from the Oscars seems somewhat arbitrary.

But so does everything else about the way the Academy deals with movies from the rest of the world. An elaborate and mysterious winnowing process pares down the thousands of potential nominees to five. This year they are “Dogtooth” from Greece, “Incendies” from Canada, “Biutiful” from Mexico, “Outside the Law” from Algeria and “In a Better World,” which might be considered the front-runner if you take the Globes as an omen.

“Dogtooth” came and went on a few American screens last spring, and “Outside the Law” had a brief run in December (and may return shortly); only “Biutiful,” whose globally famous star, Javier Bardem, was nominated for best actor, is likely to be playing now at a theater near you. The others will be released in the late winter or early spring, in the hopes of realizing some kind of box office bounce. The usual pre-nomination handicapping — the canvassing of critical opinion and the weighing of popular sentiment — does not apply to these movies, which might in principle make the choices less compromised, but in practice only serves to make them more confusing.

For example, “Of Gods and Men,” Xavier Beauvois’s moving, humane drama about a group of French monks trying to survive and honor their faith in Algeria during a time of terror and civil war, was also snubbed by the Academy. Of course, worthy films are passed over all the time, but such puzzling and capricious neglect happens so often that it can be taken as a yearly reminder of the American film establishment’s systematic marginalization and misapprehension of much of world cinema.

For some reason, the Academy insists on a one-film-per-country rule, which places a large part of the decision-making process in the hands of film industries at least as corrupt and agenda-driven as our own. Why should “Of Gods and Men” have been France’s only shot? And what determines the nationality of a film in any case? Why is Rachid Bouchareb’s “Outside the Law” an Algerian rather than a French film, given that its director is a French citizen and that it was made with mostly French financing and therefore within that country’s extensive legal statutes governing cinematic production? And what makes “Biutiful,” shot in Barcelona with a Spanish cast, a Mexican film?

My point here is not really to pick on the Academy, nor to mystify readers with effusions over movies you may never have heard of and most likely have not had an opportunity to see. My complaint, really, is about the peculiar and growing irrelevance of world cinema in American movie culture, which the Academy Awards help to perpetuate.

There are certainly examples from the last decade of subtitled films, nominated or not, that have achieved some measure of popularity: “Crouching Tiger, Hidden Dragon”; “Pan’s Labyrinth”; “The Lives of Others”; “The Girl With the Dragon Tattoo.” But these successes seem more and more like outliers. A modest American box office gross of around $1 million is out of the reach of even prizewinners from Cannes and masterworks by internationally acclaimed auteurs, most of whose names remain unknown even to movie buffs. This is less a sea change than the continuation of a 30-year trend. As fashion, gaming, pop music, social media and just about everything else have combined to shrink the world and bridge gaps of culture and taste, American movie audiences seem to cling to a cautious, isolationist approach to entertainment.

And the Oscars reinforce this, frequently ignoring accessible and entertaining movies from other countries and settling on a frequently random-seeming list of finalists. Every year, the world turns its attention to Hollywood, and Hollywood remains, in keeping with long tradition, a notably welcoming place for far-flung talent. There has always been room — work, money and even a measure of glory — for British and Australian actors, intercontinental sex symbols and émigré directors seeking freedom or fortune.

Mr. Bardem, a somewhat surprising nominee for best actor this year, has already won for a supporting role in an English-language film (in 2008, for “No Country for Old Men”), and other non-Anglophone artists are occasionally vouchsafed a moment in the spotlight. Pedro Almodóvar won for best original screenplay in 2003, and Marion Cotillard took best actress honors five years later for her performance as Edith Piaf in “La Vie en Rose.” She also won the chance to play wide-eyed, exotic, seriously underwritten love interests for Johnny Depp (in “Public Enemies”) and Leonardo DiCaprio (in “Inception”). The scale of Hollywood’s appetite, its unrivaled power to vacuum up ambition and artistry from around the world, is part of its legend and grandeur.

But it also casts a long shadow over the rest of the globe, which struggles for visibility. In the past (notably in France in the 1980s and early ’90s) there were protests against American cultural imperialism, but those seem to have waned lately. Whether this is because our imperial hegemony has overwhelmed the possibility of even rhetorical resistance or because, on the contrary, the empire is not as mighty as it used to be, is a topic for another day. My concern here is more with cultural protectionism — the impulse not to conquer the rest of the world but rather to tune it out.

I do not want to scold American audiences for failing to buy tickets to subtitled movies. Not today, anyway. Public indifference to (or ignorance of) films from beyond Hollywood or its “indie” provinces is routinely invoked as a reason that such films are not more widely available. And so the movies vanish into a vicious circle in which their marginal status is at once assumed and assured.

But on the outskirts of the commercial mainstream, in redoubtable art houses and on the upper reaches of the cable-TV spectrum, there is perhaps more variety and vitality than ever. Yes, it is fashionable in some circles to lament the old days when foreign directors and stars — mostly European and Japanese — were household names in many of the same households that housed best-selling French, Italian and Asian cookbooks. But in terms of volume and distinction, the last 15 years also qualify as a golden age. What has changed is the sense of cultural cachet and social currency.

This may itself be a product of superabundance. New technologies and traditions proliferate and cross-pollinate so rapidly that even a permanent resident of the international festival circuit would have trouble keeping track of it all. The list of national cinemas to watch seems to grow every year, so that even a superficial sense of the cinematic state of things can seem to require an up-to-the-minute awareness of what is happening in South Korea, Serbia, Kazakhstan, South Africa, Thailand and a dozen other places. New waves of creativity are cresting across Eastern Europe, Southeast Asia and Latin America. Russia is experiencing perhaps the most robust surge in filmmaking since the 1960s. France, Italy and Germany refuse to be ignored.

And then there is Greece. Your hazy recollections of Zorba or Melina Mercouri (if you have them) will not be much help in making sense of Giorgos Lanthimos’s “Dogtooth,” a creepy, funny, elegantly shot allegory of something very weird in human nature. (Language? Power? Sex? Family?) Mr. Lanthimos is part of a generation of Greek filmmakers whose work is iconoclastic, formally daring and sometimes abrasive. These directors, in turn, are part of a loose network that spreads across much of the world, linked by the promise of festival exposure and the challenge of raising money in a worldwide climate of economic constriction.

Their work is almost invisible here, though it commands a fair amount of attention in the flourishing and contentious cinephile wing of the blogosphere. But it is nonetheless available to anyone with the curiosity and patience to navigate the new, fast-evolving cosmos of V.O.D. and streaming Web video. The Academy will sometimes take notice — more often it will not — but a whole world of movies is out there waiting to be discovered.

PDEs at Canada's McGill University

"New class for the physics department?"

Partial differential equations could become a requirement at McGill

by

Jenna Blumenthal

January 31st, 2011

The McGill Daily

Not since they put a flat screen in the foyer has Rutherford Physics seen such excitement. The building is buzzing with rumours that the inclusion of a course on partial differential equations (PDEs) – a mathematical tool indispensable for physicists – will be the next big change to the majors curriculum.

“PDEs is one of those things that if you want to do physics, it pretty much puts up a wall if you don’t have it,” says McGill Society of Physics Students VP Academic Nina Kudryashova. “It’s so omnipresent.”

Although it’s been brought up, it is unlikely that PDEs will become a requirement anytime soon. “To even give it rumour status is going a little far,” says Physics Undergraduate Curriculum Committee Chairman Professor Kenneth Ragan, “and for current [physics] students lacking PDEs, it’s not fatal.”

Physics professors often include higher-level math, like PDEs, in their curriculum on a need-to-know basis: if a particular tool from a math course which is not required for physics majors is needed, the professor will explain it in class.
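For readers outside Rutherford Physics, the tool in question relates a quantity to its rates of change in several variables at once. The article never writes one down, but a few canonical examples a physics major meets early (standard textbook forms, not taken from the piece) are:

```latex
% Heat (diffusion), wave, and Schroedinger equations -- the PDEs a physics
% undergraduate runs into almost immediately (standard textbook forms).
\begin{align}
  \frac{\partial u}{\partial t} &= \alpha\,\nabla^{2}u
      && \text{heat conduction and diffusion} \\
  \frac{\partial^{2} u}{\partial t^{2}} &= c^{2}\,\nabla^{2}u
      && \text{waves on strings, sound, light} \\
  i\hbar\,\frac{\partial \psi}{\partial t} &=
      -\frac{\hbar^{2}}{2m}\nabla^{2}\psi + V\psi
      && \text{quantum mechanics (Schr\"odinger)}
\end{align}
```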

A potential curriculum change must clear several aching levels of administration. It must first be approved by the Undergraduate Curriculum Committee within the Physics department, and then by the department as a whole. Finally, it is brought to the Academic Policy Committee of the Science Faculty and the Subcommittee on Courses and Teaching Programs.

In a year where a lack of dialogue between students and the administration has been a ubiquitous campus issue, the process of curriculum approval manages to go against the grain. Students participate in committee activities at each level, and certainly within the Physics department, professors are listening. “Students at the UCC are in some sense the most critical,” says Ragan. “They are who we get most of our feedback from, and we take that feedback very seriously.”

Saturday, January 29, 2011

Deceased--Willi Dansgaard

Willi Dansgaard
August 30th, 1922 to January 8th, 2011

What are the odds that we have drunk some T-Rex pee?

"Willi Dansgaard Dies at 88; Read Climates in Old Ice"

by

Douglas Martin

January 28th, 2011

The New York Times

It is said that every breath we take contains molecules breathed by Socrates, Jesus and Abraham Lincoln — a notable exception being the ancient air that the Danish geophysicist Willi Dansgaard and his colleagues found trapped deep in the ice of Greenland.

Drilling more than a mile through millenniums of ice, Dr. Dansgaard and a team of researchers journeyed backward through time to analyze trapped bubbles of air containing molecules of oxygen so old they had never been breathed by a human. By analyzing samples, they could gauge the world’s climate during various periods and determine how it had changed.

They found, among other things, that temperatures have changed more suddenly and violently than had long been supposed; around 15,000 years ago, for example, Greenland abruptly warmed by 16 degrees within 50 years. It had been thought that major climate changes occurred over thousands of years.

Perhaps more important, Dr. Dansgaard, who died this month, perfected ways to date icebound gases as well as to analyze acidity, dust and other influences on climatic conditions. Ice-core analysis proved to be a major advance in studying the climate history of the planet, providing evidence that predated other sources of measurement like tree rings, lake sediments and petrified organic matter.

In 1996, he was awarded the Tyler Prize, environmental science’s highest award, as were Claude Lorius of France and Hans Oeschger of Switzerland. Through his own ice-core analysis, Dr. Oeschger also discerned abrupt climate changes, which have become known as “Dansgaard-Oeschger events.”

Dr. Dansgaard’s work had significant implications in measuring carbon dioxide concentrations over vast periods and understanding the dynamics of global warming. One finding was that temperatures and so-called greenhouse gases, which trap heat, move in lockstep. In an interview, William Sweet, author of “Kicking the Carbon Habit” (2006), called this “the most compelling evidence we have of the relationship between greenhouse gases and global temperature.”

Dr. Dansgaard died on Jan. 8 in Copenhagen at 88, the Niels Bohr Institute of the University of Copenhagen said.

He was born in Copenhagen on Aug. 30, 1922, and received all his schooling there, including a doctorate in physics. He first went to Greenland in 1947 to study magnetism and was “bitten with Greenland for life,” he later said, citing “its forces, its bounty, its cruelty and, above all, its beauty.”

After returning to Denmark, he began using a machine called a mass spectrometer to weigh and identify oxygen molecules based on their number of neutrons. That led to an experiment to see if the hydrogen and oxygen molecules in rainwater changed from one rainfall to the next. He put funnels in beer bottles on his lawn to find out.

He discovered that as clouds rise and cool, heavier forms of oxygen (or isotopes — those with more neutrons) will react to the cold sooner, condensing and falling back to earth as precipitation before lighter ones do; a preponderance of heavier oxygen molecules would then indicate colder atmospheric temperatures. The discovery — that the chemical composition of oxygen revealed temperature — has been deemed Dr. Dansgaard’s most important.
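That chemical signal is conventionally reported as a "delta" value: the heavy-to-light oxygen ratio of a sample measured against a reference standard, in parts per thousand. The obituary doesn't spell out the formula, but the standard definition Dansgaard worked with is:

```latex
% delta-O-18 in per mil; R is the ratio of 18-O to 16-O in the sample.
% Colder condensation temperatures leave precipitation (and ice) with
% more negative delta values -- the basis of the ice-core thermometer.
\delta^{18}\mathrm{O}
  = \left(\frac{R_{\text{sample}}}{R_{\text{standard}}} - 1\right)\times 1000
```

His 1964 Tellus paper (mentioned below) fit an approximately linear relation between this quantity in mean annual precipitation and local temperature, which is what made the isotopes usable as a thermometer.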

He then obtained worldwide rain samples from the International Atomic Energy Agency, which collected samples to track radiation from bomb tests. He again found oxygen isotopes an excellent predictor of temperature and published a landmark paper on the subject in the journal Tellus in 1964.

To see if his findings might be confirmed in Greenland, he visited an American cold war outpost there, Camp Century, which had been set up to plan for possible nuclear attacks over the Arctic Circle.

Scientists at the camp, who were studying numerous Arctic phenomena, many unconnected to military utility, were drilling into the ice to study how to differentiate one year of snowfall from the next. They agreed to let Dr. Dansgaard analyze ice samples, using the yearly snow measurements as a yardstick.

Wallace Broecker, the earth scientist at Columbia University who is often credited with introducing the term global warming, said in an interview that the importance of the mid-’60s ice-core work of Dr. Dansgaard and Dr. Oeschger did not immediately register on scientists except as “a curiosity of ice in Greenland.”

But after the drilling of another core from 1979 to 1981, interest skyrocketed. “This was astounding that you could get these temperature changes so rapidly,” Dr. Broecker said. He estimated that Dr. Dansgaard accelerated ice-core study by 10 years.

Dr. Dansgaard is survived by his children, Birgitte, Finn and Trine; six grandchildren; and one great-grandchild.

In his book, Mr. Sweet quoted Dr. Dansgaard as modestly suggesting that his seminal inspiration — that isotopes might be the key to learning the temperatures of past epochs — was his only good idea.

“It was in fact a brilliant idea,” Mr. Sweet wrote, “and for the rest of his working life, with growing crews of oarsmen at his beck and call, Dansgaard would pursue it with the obsessiveness of a Viking raider.”

Deceased--Charles Brittin

Charles Brittin
May 2nd, 1928 to January 23rd, 2011


"Charles Brittin dies at 82; photographer who chronicled movements of 1950s and '60s"

The relatively unknown artist documented L.A.'s beat culture and emerging arts scene, the civil rights movement, the Black Panthers and antiwar protests.

by

Valerie J. Nelson

January 29th, 2011

Los Angeles Times

His unblinking yet compassionate photographs in the 1950s and '60s documented Los Angeles' beat culture and emerging art scene, the civil rights movement here and in the Deep South, the Black Panthers and antiwar protests.

Yet Charles Brittin was relatively unknown.

Sidelined by declining health beginning in the '70s, he faded from the scene as documentary photographers were first being recognized as artists, said Andrew Perchuk, deputy director of the Getty Research Institute, which holds Brittin's photographic archive.

"He was an absolutely critical figure in Los Angeles, because he was at the intersection of so many things that were happening," Perchuk said. "He also was one of the great civil and political photographers of the age."

Brittin, who had liver and kidney transplants in the 1990s, died Sunday of pneumonia at Saint John's Health Center in Santa Monica, said his lawyer, Salomon Illouz. He was 82.

One of the first subjects to fascinate Brittin as a photographer was a sleepy Venice Beach, where he took pictures "freighted with a hushed beauty and forlorn sweetness," according to the book "Charles Brittin: West and South," scheduled to be published in April.

He preserved a pre-gentrified Venice that has all but vanished: Oil derricks jockey with houses and a waterway, and decay creeps into the frame of a once-grand colonnade. In “Big Head, Ocean Park” (1957), a slightly disturbing and clownish ticket booth stands sentry at a funhouse.

A chance meeting in the 1950s with seminal beat-scene artist Wallace Berman pulled Brittin into a circle of avant-garde artists who hung out on La Cienega Boulevard at the Ferus Gallery, the influential contemporary art gallery.

Brittin's Venice Beach shack became the group's second home, and he turned into the unofficial house photographer of a crowd that included actors Dean Stockwell and Dennis Hopper, artist John Altoon, curator Walter Hopps and poet David Meltzer.

"He was probably the beat generation photographer," said Craig Krull, a Santa Monica gallery owner who exhibited Brittin's work in 1999.

"A lot of the people Charles took pictures of ended up becoming legendary figures," Krull said. "His photographs are more than just documents of artists and events. They are very incisive and powerful and poetic and tough."

They also have a "romantic resonance," because many of the elements in them are "gone forever," Brittin said in the catalog for the 1999 show.

As the beat movement gave way to civil unrest in the 1960s, Brittin took his camera to the front lines, and his often tightly focused images were filled with raw emotion. One from a 1965 protest at the Federal Building in Los Angeles shows no faces, only body parts — the splayed legs of a black female protester being gripped by a white officer.

His political activism had its roots in his childhood in Cedar Rapids, Iowa, where Charles William Brittin was born May 2, 1928.

He was the youngest of three children of a father who quit teaching and eventually ran a grocery store. "Keenly aware" that his family had "lost status," he came to identify with the oppressed, Brittin recalled in the catalog.

At 15, he moved to the Fairfax district in Los Angeles with his mother after his father died. The liberal student body at Fairfax High School influenced his political views, and he was soon a Marxist "on my way to changing the world," Brittin told The Times in 1999.

He moved again, to Pomona, and after graduating from high school spent several years studying at UCLA.

In the 1950s, he married and divorced twice — and bought his first camera.

His third wife, Barbara, whom he married in 1961, shared his commitment to activism.

While donating money to the Congress of Racial Equality, the couple attended a meeting where the group posed a question: "Who is prepared to be arrested this week?"

"In six months, Barbara was teaching techniques of nonviolent resistance, and I was taking political photographs," Brittin said in The Times in 1999.

He made dramatic black-and-white prints of protests in Southern California and in Mississippi and Louisiana, where he and his wife spent three months in 1965. By the end of the '60s, Brittin was chronicling the Black Panther movement.

"He had an absolutely phenomenal sense of composition," Perchuk said. "Even when he was in the midst of action at a demonstration, he found a perfect way to frame it that conveyed very precisely what was going on."

From 1963 to 1970, Brittin worked as the official photographer in the Los Angeles studio of noted midcentury designers Charles and Ray Eames.

Throughout his career, he also photographed still lifes composed of unlike objects such as a woman's high-heeled feet with an iron-link chain or doll heads.

Brittin's photographs will be featured in "Pacific Standard Time," an exhibit of collected works scheduled to open Oct. 1 at the Getty Center.

When a slowly progressing condition caused his health to deteriorate, he put his cameras away until the 1990s, when his health improved after his transplants.

With Barbara, he lived for decades in Santa Monica Canyon. She died at 74 in 2003. He has no immediate survivors.

Before the civil rights movement, he did not have "the confidence to exploit the opportunities that came my way," Brittin said in the 1999 catalog. "Then, something more important than my personal comfort was at stake, so I was able to be aggressive and do things that seemed unnatural to me."

Humble existence


Interesting perspective.

"Live Like a Grad Student … Forever"

An Oxford academic recommends living on as little as you can and giving away the rest.

by

Toby Ord

January 29th, 2011

Slate

As an academic at Oxford University I don't have an enormous salary, but even so I have made a pledge to donate £1m to charity over the course of my working life.

It wasn't an easy decision, but I chose to do this after realising just how much more good my money could do for others than for me. I'm a research fellow in ethics, and my thoughts on the ethical issues around global poverty have had a dramatic impact on my personal behaviour.

Philosopher Peter Singer—a fellow Australian—said that the money we spend on luxuries could be used to save people's lives in developing countries if we so wished. How then can we justify choosing the luxuries? This is a strong argument, and quite confronting. So I asked myself what standard of living is justifiable. How little could I live on? The figure I came to is around £10,000 a year, including rent, clothes, food, and holidays.

I'm happy to continue the relatively frugal lifestyle I had as a graduate student—it's certainly much better than what most people in the world can afford. I have a nice place to live, a good computer and phone. The things that are most important to me cost very little: such as spending time with my wife and friends, reading books, and listening to music.

I am 31, so I can expect to work for another 35 years. My annual salary over that time will average about £45,000. After tax this should allow me to give away an average of £30,000 a year—amounting to slightly more than £1m by the time I retire. Last year I earned £25,000 before tax and gave away £10,000 of it. I also managed to save some money toward buying a house with my wife, which we will live in, then eventually sell and give away the proceeds.
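The arithmetic of the pledge checks out; a throwaway sketch using the essay's own figures:

```python
# Back-of-envelope check of the pledge arithmetic in the essay.
years_remaining = 35        # "I can expect to work for another 35 years"
given_per_year = 30_000     # average annual donation, in pounds
total = years_remaining * given_per_year
print(f"{total:,} pounds")  # 1,050,000 -- "slightly more than 1m", as claimed
```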

I do my giving at the end of each year. I've researched which charities are the most effective, and this has led me to support those fighting tuberculosis and parasitic infections in the developing world.

Between 1994 and September 2010, the Bill & Melinda Gates Foundation made grants worth $23.9 billion. Two anonymous donors last year gave £250,000 so that Gipsy Moth IV, the yacht in which Francis Chichester single-handedly circumnavigated the globe in 1967, could remain in Britain. An anonymous benefactor stepped in after a couple in Louisiana lost their home in 2007 in the wake of a dispute surrounding a $1.63 property-tax bill and gave them the deeds to the house.

I calculate that the money I give away can save between 2,000 and 10,000 lives (taking into account the total cost of medication, delivery, and administration). Compared with this, the benefit that I would get from spending the money on myself is clearly quite insignificant. Indeed, there are strong arguments that by donating to the most cost-effective charities, your money goes more than 10,000 times as far as if you spend it on yourself.

I'm not doing this alone. I've started an organization called Giving What We Can, which has 64 members (including Peter Singer). Each of us has pledged to donate at least 10 percent of our earnings to wherever we think it can do the most good. My wife, Bernadette, an NHS doctor, has supported this from the beginning and will give away everything she earns over £25,000.

As a couple, we don't think that what we're doing is really all that sacrificial. We are still left with a joint income that is exceptionally high by world standards. We are donating about a third but are not having to give up the things that really matter to us. In fact, it is probably improving my quality of life. It's not that it gives me a warm glow, but it does give me a certain peace with myself and a sense of purpose.

Some people suggest I should first make huge amounts of money. My answer is that by doing it my way I have attracted others to join me—and our collective giving will be far more substantial than anything I could have made on my own. Together, the members of Giving What We Can have pledged to donate £14m over the course of our careers, which should save between 30,000 and 100,000 lives. I'm not sure I could have made that much for charity by playing the markets.

Ernest Borgnine...opinionated

With his perspective, Borgnine...says modern filmmaking has suffered from too much focus on making money. "Take a look at those old pictures. Even if they're B or C quality, they still have a certain something that catches you. You don't find that today. Today, it's either guns, sex or guns and sex and everybody's killed at the end and everybody's having a ball."

Have to agree. Long gone are the low-budget, thoughtful films.

"For Ernest Borgnine, career has been an eternity"

by

Bill Keveney

January 28th, 2011

USA TODAY

Ernest Borgnine isn't making a big deal about his 94th birthday.

A phone call from a cousin in Italy, to whom he responds in vigorous Italian, fine. Birthday presents arriving from near and far, fine. But big plans for the evening, at the suggestion of his children? No way.

"It's just an ordinary day. Why make such a big fuss? I have to go out and get drunk or something? Do you mind if I just stay home? That's exactly what I'm doing," he says, punctuating his statement with a hearty laugh.

This is no curmudgeon. Belly laughs are frequent during a 45-minute conversation at the hilltop home of the actor, who will receive the Life Achievement Award at The 17th Annual Screen Actors Guild Awards on Sunday (TNT and TBS, 8 ET/5 PT). He lights up as Tova, his wife of nearly 40 years, stops by to say hello.

Borgnine credits his mother for suggesting that he try acting when he was looking for work after 10 years in the Navy. "She said, 'Have you ever thought of becoming an actor? You always like to make a damn fool of yourself in front of people. Why don't you give it a try?' "

The Connecticut native talks about a varied career of more than 60 years, of acting at Virginia's Barter Theatre for $30 a week, of working with Helen Hayes on Broadway, with Gary Cooper and Spencer Tracy in film and, for a much younger generation, in SpongeBob SquarePants on TV. He and Frank Sinatra signed their Christmas cards Fatso and Maggio, their respective characters in 1953's From Here to Eternity, a breakout role for Borgnine.

It's a snapshot of cinema history when Borgnine talks about looking at Burt Lancaster and fellow nominee Jimmy Cagney as Grace Kelly announced his name as the winner of the 1955 best-actor Oscar for playing the lonely butcher in Marty.

"Jerry Lewis had bet me a buck ninety-eight that I'd win. I'd gone home and taken 198 pennies and put them in a red sock, and as I went up there, they all wondered what I passed to Jerry Lewis," says the actor, pointing to the Oscar on a shelf in the living room.

After initially resisting series TV, ego pushed Borgnine into a medium most film actors wouldn't touch in the early '60s. A kid selling candy couldn't identify the actor but was familiar with Gunsmoke's James Arness and Have Gun Will Travel's Richard Boone. That persuaded him to enlist in McHale's Navy.

"I called my agent. He said, 'What made up your mind?' I said, 'None of your damn business.' " (Don't get Borgnine started on agents. He says one cost him the lead in 1968 film The Shoes of the Fishermen. "Agents. Let's not get into that.")

The TV move helped make him a household name, and it didn't seem to hurt a movie career that continued with The Dirty Dozen, The Wild Bunch and The Poseidon Adventure.

However, Borgnine's story isn't just one of old Hollywood. He's in the upcoming comedy Night Club, and he appeared in last year's RED with Bruce Willis. He's hoping for a sequel. "This time, I want a gun. If you can handle a gun on The Wild Bunch, you can handle anything," he says.

Borgnine's secret for a long career? He's not another pretty face. "I was a character actor. Do I look like a good-looking man? No," he says, flashing that famous gap-toothed smile. "But, see, I keep working when the rest of the boys are retired."

The five-times-married father of three isn't finicky about scripts, either. "I read it. If I don't fall asleep, it's pretty good."

He gets a kick out of those who wonder about his age, relating a suggestion Tova made to British film officials a few years ago. "She said, 'Why don't you invite my husband to speak?' And they waited a long time and they said, 'Is he coherent?' " he says, putting on a British accent. He eventually spoke and received a standing ovation.

With his perspective, Borgnine, a Turner Classic Movies fan, says modern filmmaking has suffered from too much focus on making money. "Take a look at those old pictures. Even if they're B or C quality, they still have a certain something that catches you. You don't find that today. Today, it's either guns, sex or guns and sex and everybody's killed at the end and everybody's having a ball."

Something is missing in the acting trade, too. "You don't see any more Gary Coopers or Spencer Tracys." (He is a fan of Robert Downey Jr.)

Overall, he's not complaining.

"It's been a long and successful (career), successful in many ways. There's been a lot of heartaches and a lot of tears and worrying about jobs and everything else," he says. "But we made it and here we are."

Wednesday, January 26, 2011

Vladimir Nabokov...the lepidopterist


"Nonfiction: Nabokov Theory on Butterfly Evolution Is Vindicated"

by

Carl Zimmer

January 25th, 2011

The New York Times

Vladimir Nabokov may be known to most people as the author of classic novels like “Lolita” and “Pale Fire.” But even as he was writing those books, Nabokov had a parallel existence as a self-taught expert on butterflies.

He was the curator of lepidoptera at the Museum of Comparative Zoology at Harvard University, and collected the insects across the United States. He published detailed descriptions of hundreds of species. And in a speculative moment in 1945, he came up with a sweeping hypothesis for the evolution of the butterflies he studied, a group known as the Polyommatus blues. He envisioned them coming to the New World from Asia over millions of years in a series of waves.

Few professional lepidopterists took these ideas seriously during Nabokov’s lifetime. But in the years since his death in 1977, his scientific reputation has grown. And over the past 10 years, a team of scientists has been applying gene-sequencing technology to his hypothesis about how Polyommatus blues evolved. On Tuesday in the Proceedings of the Royal Society of London, they reported that Nabokov was absolutely right.

“It’s really quite a marvel,” said Naomi Pierce of Harvard, a co-author of the paper.

Nabokov inherited his passion for butterflies from his parents. When his father was imprisoned by the Russian authorities for his political activities, the 8-year-old Vladimir brought a butterfly to his cell as a gift. As a teenager, Nabokov went on butterfly-hunting expeditions and carefully described the specimens he caught, imitating the scientific journals he read in his spare time. Had it not been for the Russian Revolution, which forced his family into exile in 1919, Nabokov said that he might have become a full-time lepidopterist.

In his European exile, Nabokov visited butterfly collections in museums. He used the proceeds of his second novel, “King, Queen, Knave,” to finance an expedition to the Pyrenees, where he and his wife, Vera, netted over a hundred species. The rise of the Nazis drove Nabokov into exile once more in 1940, this time to the United States. It was there that Nabokov found his greatest fame as a novelist. It was also there that he delved deepest into the science of butterflies.

Nabokov spent much of the 1940s dissecting a confusing group of species called Polyommatus blues. He developed forward-thinking ways to classify the butterflies based on differences in their genitalia. He argued that what were thought to be closely related species were actually only distantly related.

At the end of a 1945 paper on the group, he mused on how they had evolved. He speculated that they originated in Asia, moved over the Bering Strait, and moved south all the way to Chile.

Allowing himself a few literary flourishes, Nabokov invited his readers to imagine “a modern taxonomist straddling a Wellsian time machine.” Going back millions of years, he would end up at a time when only Asian forms of the butterflies existed. Then, moving forward again, the taxonomist would see five waves of butterflies arriving in the New World.

Nabokov conceded that the thought of butterflies making a trip from Siberia to Alaska and then all the way down into South America might sound far-fetched. But it made more sense to him than an unknown land bridge spanning the Pacific. “I find it easier to give a friendly little push to some of the forms and hang my distributional horseshoes on the nail of Nome rather than postulate transoceanic land-bridges in other parts of the world,” he wrote.

When “Lolita” made Nabokov a star in 1958, journalists were delighted to discover his hidden life as a butterfly expert. A famous photograph of Nabokov that appeared in The Saturday Evening Post when he was 66 is from a butterfly’s perspective. The looming Russian author swings a net with rapt concentration. But despite the fact that he was the best-known butterfly expert of his day and a Harvard museum curator, other lepidopterists considered Nabokov a dutiful but undistinguished researcher. He could describe details well, they granted, but did not produce scientifically important ideas.

Only in the 1990s did a team of scientists systematically review his work and recognize the strength of his classifications. Dr. Pierce, who became a Harvard biology professor and curator of lepidoptera in 1990, began looking closely at Nabokov’s work while preparing an exhibit to celebrate his 100th birthday in 1999. She was captivated by his idea of butterflies coming from Asia. “It was an amazing, bold hypothesis,” she said. “And I thought, ‘Oh, my God, we could test this.’ ”

To do so, she would need to reconstruct the evolutionary tree of blues, and estimate when the branches split. It would have been impossible for Nabokov to do such a study on the anatomy of butterflies alone. Dr. Pierce would need their DNA, which could provide more detail about their evolutionary history.

Working with American and European lepidopterists, Dr. Pierce organized four separate expeditions into the Andes in search of blues. Back at her lab at Harvard, she and her colleagues sequenced the genes of the butterflies and used a computer to calculate the most likely relationships between them. They also compared the number of mutations each species had acquired to determine how long ago they had diverged from one another.
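The dating step rests on the molecular-clock idea: if substitutions accumulate at a roughly constant rate, two lineages that split T years ago should differ by about 2rT substitutions per site, so T can be read off the measured distance. A toy sketch of the reasoning (the rate and distance below are invented for illustration, not figures from the Pierce study):

```python
# Molecular-clock toy: lineages diverge for time T along two branches, so the
# expected per-site distance is K = 2*r*T, giving T = K / (2*r).
# Numbers are hypothetical, chosen so the answer lands near the paper's
# ~10-million-year common ancestor.

def divergence_time(per_site_distance, rate_per_site_per_year):
    return per_site_distance / (2.0 * rate_per_site_per_year)

K = 0.23       # hypothetical substitutions per site between two species
r = 1.15e-8    # hypothetical substitution rate per site per year

print(f"{divergence_time(K, r) / 1e6:.1f} million years")  # 10.0
```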

There were several plausible hypotheses for how the butterflies might have evolved. They might have evolved in the Amazon, with the rising Andes fragmenting their populations. If that were true, the species would be closely related to one another.

But that is not what Dr. Pierce found. Instead, she and her colleagues found that the New World species shared a common ancestor that lived about 10 million years ago. But many New World species were more closely related to Old World butterflies than to their neighbors. Dr. Pierce and her colleagues concluded that five waves of butterflies came from Asia to the New World — just as Nabokov had speculated.

“By God, he got every one right,” Dr. Pierce said. “I couldn’t get over it — I was blown away.”

Dr. Pierce and her colleagues also investigated Nabokov’s idea that the butterflies had come over the Bering Strait. The land surrounding the strait was relatively warm 10 million years ago, and has been chilling steadily ever since. Dr. Pierce and her colleagues found that the first lineage of Polyommatus blues that made the journey could survive a temperature range that matched the Bering climate of 10 million years ago. The lineages that came later are more cold-hardy, each with a temperature range matching the falling temperatures.

Nabokov’s taxonomic horseshoes turn out to belong in Nome after all.

"What a great paper," said James Mallet, an expert on butterfly evolution at University College London. "It's a fitting tribute to the great man to see that the most modern methods that technology can deliver now largely support his systematic arrangement."

Dr. Pierce says she believes Nabokov would have been greatly pleased to be so vindicated, and points to one of his most famous poems, “On Discovering a Butterfly.” The 1943 poem begins:

I found it and I named it, being versed

in taxonomic Latin; thus became

godfather to an insect and its first

describer — and I want no other fame.

“He felt that his scientific work was standing for all time, and that he was just a player in a much bigger enterprise,” said Dr. Pierce. “He was not known as a scientist, but this certainly indicates to me that he knew what it’s all about.”

This article has been revised to reflect the following correction:

Correction: January 26, 2011

An earlier version of this article misstated the year Vladimir Nabokov immigrated to the United States. It was 1940, not 1941.

"Nabokov’s Blue Butterflies"

by

Erin Overbey

January 27th, 2011

The New Yorker

Vladimir Nabokov once said, “A writer should have the precision of a poet and the imagination of a scientist.” The famed author exhibited both equally in his writing and in his non-literary pursuits, which included lepidopterology, the study of butterflies and moths. Although he is of course best known for his intricate novels and essays, the past decade has seen a rediscovery of Nabokov’s entomological ventures. On Tuesday, the Times revealed that a team of scientists had vindicated a nearly seventy-year-old theory of his about the development of the Polyommatus blue butterflies:

[I]n a speculative moment in 1945, [Nabokov] came up with a sweeping hypothesis for the evolution of the butterflies he studied, a group known as the Polyommatus blues. He envisioned them coming to the New World from Asia over millions of years in a series of waves. Few professional lepidopterists took these ideas seriously during Nabokov’s lifetime. But in the years since his death in 1977, his scientific reputation has grown. And over the past 10 years, a team of scientists has been applying gene-sequencing technology to his hypothesis about how Polyommatus blues evolved. On Tuesday in the Proceedings of the Royal Society of London, they reported that Nabokov was absolutely right.

Dr. Naomi Pierce, a co-author of the report, organized four separate trips to the Andes to collect the blues, and then she and her colleagues at Harvard sequenced the genes of the butterflies and compared the number of mutations each species had acquired. Their research resulted in the revelation that five waves of butterflies came from Asia to America, as Nabokov had originally hypothesized.

Butterfly hunting was a popular sport among Russian intellectuals in the early- to mid-twentieth century, and enthusiasts were often referred to as “fly doctors.” Nabokov became fascinated with butterfly collecting and study as a child, and, when he emigrated to America in 1940, he brought his love of lepidopterology with him, taking a job as a curator at the Museum of Comparative Zoology at Harvard University. In an interview he gave to the Paris Review in 1967, Nabokov noted that

The pleasures and rewards of literary inspiration are nothing beside the rapture of discovering a new organ under the microscope or an undescribed species on a mountainside in Iran or Peru. It is not improbable that had there been no revolution in Russia, I would have devoted myself entirely to lepidopterology and never written any novels at all.

The novelist also went on extensive butterfly-hunting expeditions across America while he was working on his masterpiece, “Lolita.” In “Nabokov’s Blues: The Scientific Odyssey of a Literary Genius,” Kurt Johnson and Steven L. Coates observe,

The sublime joy associated with this [entomological] work was due to the butterfly-hunting trips to the West that Nabokov took every summer. Nabokov, who never learned to drive a car, estimated that in the glory years between 1949 and 1959, Véra drove him more than 150,000 miles all over North America, mostly on butterfly trips. Those expeditions have taken on the aura of legend among lepidopterists as well as Nabokov’s literary admirers, and it was a habit he maintained, with only the geographical scenes shifting, for the rest of his life.

Blue butterflies can have some of the shortest life spans of any species, and Nabokov seemed particularly fascinated by this quality of ephemeral metamorphosis. In “Speak, Memory,” part of which ran as an essay titled “Butterflies” in the June 12, 1948, issue of The New Yorker, he contemplated the connection between art and the changing subtlety of these fragile insects:

The mysteries of mimicry had a special attraction for me. Its phenomena showed an artistic perfection usually associated with man-wrought things. Such was the imitation of oozing poison by bubble-like macules on a wing (complete with pseudo-refraction) or by glossy yellow knobs on a chrysalis (“Don’t eat me—I have already been squashed, sampled, and rejected”). When a certain moth resembled a certain wasp in shape and color, it also walked and moved its antennae in a waspish, unmothlike manner. When a butterfly had to look like a leaf, not only were all the details of a leaf beautifully rendered but markings mimicking grub-bored holes were generously thrown in. “Natural selection,” in the Darwinian sense, could not explain the miraculous coincidence of imitative aspect and imitative behavior, nor could one appeal to the theory of “the struggle for life” when a protective device was carried to a point of mimetic subtlety, exuberance, and luxury far in excess of a predator’s power of appreciation. I discovered in nature the nonutilitarian delights that I sought in art. Both were a form of magic, both were a game of intricate enchantment and deception.

Nabokov’s love of and appreciation for entomology would continue until his last days. Shortly before he died, in 1977, his son Dmitri recorded the following anecdote in his diary:

A few days before he died there was a moment I remember with special clarity. During the penultimate farewell, after I had kissed his still-warm forehead—as I had for years when saying goodbye—tears suddenly welled in Father’s eyes. I asked him why. He replied that a certain butterfly was already on the wing; and his eyes told me he no longer hoped that he would live to pursue it again.

Deceased--Milton Levine

Milton Levine
November 3rd, 1913 to January 16th, 2011


"Milton Levine dies at 97; co-creator of popular ant farm toys"

Uncle Milton's Ant Farm was an instant hit in the fad-crazy 1950s. More than 20 million were sold during Levine's lifetime. 'Humanity can learn a lot from the ant,' he said.

by

Valerie J. Nelson

January 26th, 2011

Los Angeles Times

The creation of a toy that would become an American classic was triggered in 1956 by a Fourth of July parade of ants at a Studio City picnic.

While gazing at the industrious insects, novelty-toy entrepreneur Milton Levine was transported back to childhood and his uncle's farm, where he collected ants in jars and watched them "cavort," Levine told The Times in 2002.

"We should make an antarium," he recalled announcing.

With his brother-in-law, Levine soon devised what was eventually named Uncle Milton's Ant Farm, which was an instant hit in the fad-crazy 1950s. More than 20 million of the now-familiar green ant colonies were sold in Levine's lifetime, according to the Westlake Village company that makes them.

Levine, who was known as Uncle Milton, died of natural causes Jan. 16 at a Thousand Oaks assisted-care facility, said his son, Steven. He was 97.

At first, the ant farm was sold through a mail-order business that Levine established in 1946 in Pittsburgh with his brother-in-law, E.J. Cossman, a gifted pitchman. The whimsical ant community was one more offbeat product from a company that marketed plastic shrunken heads to hang on rear-view mirrors and spud guns that fired potato pellets.

After moving the company to Hollywood in 1952, Levine decided that he needed a "unique product" if the business were to succeed long term, he told The Times in 1986.

Once Levine hit on the idea for the ant farm, models were fashioned out of plastic tissue dispensers. The first farms that were sold — two sheets of transparent plastic that framed sand topped by a farm scene — looked much like they do today. He advertised them in The Times by saying something akin to "Watch the ants dig tunnels and build bridges" and received so many orders for the $1.98 product he "couldn't believe it," he said in the 1986 interview.

The ant farm became a classic partly because it "stoked the curiosity" of budding scientists and provided a fascinating educational experience, said Tim Walsh, a toy historian who last interviewed Levine in 2006 for the documentary "Toyland" (2010).

Because ants won't survive on the store shelf, they are obtained by mailing in a coupon that comes in the box, which added to the toy's mystique, Walsh said.

"Part of the thrill of the ant farm was that you had to wait and check your mail every day" for the 25 or so ants to arrive in a vial, Walsh said.

The insects were gathered by ant rustlers who were paid a penny apiece for red harvester ants from the Mojave Desert.

Over time, the ant farm was tinkered with. The original glue was toxic to some ants, so a replacement was found. Sand made way for whitish volcanic ash, which made the ants more visible.

Both Levine and Cossman promoted the ant farm on television. Levine made an "executive" ant farm of mahogany and glass for Dick Clark and spoke at length with the puppet Lamb Chop on "The Shari Lewis Show."

In 1965, Levine bought out Cossman, who went on to become a marketing guru. He died at 84 in 2002.

Cossman & Levine Inc. was renamed Uncle Milton Industries, a re-branding that came about partly because Levine often said he was tired of hearing the joke, "If you're in the ant business, where's the 'uncle?'"

Milton Martin Levine was born Nov. 3, 1913, in Pittsburgh to Harry and Mary Levine, Jewish immigrants from Russia. His father started a chain of dry cleaners.

In the Army during World War II, Milton led a platoon that built bridges in France and Germany. He met his future wife, Mauricette, when the French citizen was playing classical piano at a USO in Normandy.

After the war, Levine followed the advice of a newsletter that said the best businesses to go into were toys or bobby pins, both of which were in short supply, he later recalled.

The multimillion-dollar company Levine co-founded became known for educational and scientific toys that include frog and butterfly habitats, planetariums and mini-greenhouses. After the business moved to Westlake Village in the mid-1990s, Levine retired and his son ran the company.

When Uncle Milton Industries was sold in June to Transom Capital Group, a private-equity firm, it was valued at between $30 million and $40 million.

"Ants work day and night, they look out for the common good and never procrastinate," Levine told The Times in 2002. "Humanity can learn a lot from the ant."

More than once, Levine said of ants: "I found out their most amazing feat yet. They put three kids through college."

Besides his son Steven, Levine is survived by Mauricette, his wife of 65 years; daughters Harriet and Ellen; sisters Pearl Cossman and Ruth Shriber; and three grandchildren.

Roger Ebert on 3D


I have to agree with Ebert.

"Two Thumbs, Two Dimensions"

Roger Ebert is done talking about 3-D movies. Thank goodness.

by

Daniel Engber

January 25th, 2011

Slate Magazine

As far as Roger Ebert is concerned, the discussion about 3-D is over. "The notion that we are asked to pay a premium to witness an inferior and inherently brain-confusing image is outrageous," he wrote in his blog Sunday. "The case is closed."

If that means Ebert will stop complaining about the medium, so much the better. For years now, the venerable critic has been griping that 3-D cinema is dim, distracting, and useless. And I mean for years: Even at the age of 10, young Ebert turned up his nose at Arch Oboler's stereo jungle adventure, Bwana Devil. (Deeply unmoved, was he, by the hails of spears.) That was back in 1952; more than a half-century later, he's still shaking his fist at the silver screen—I hate 3-D and you should, too! Professional obligations notwithstanding, Ebert doesn't want to see another movie in three dimensions. Ever.

I've had enough of this persnickety crusade, marching, as it does, under the banner of pseudoscience. "Our ancestors on the prehistoric savannah developed an acute alertness to motion," Ebert writes, in an attempt to explain why movies like Clash of the Titans totally suck:

But what about rapid movement toward the viewer? Yes, we see a car aiming for us. But it advances by growing larger against its background, not by detaching from it. Nor did we evolve to stand still and regard its advance. To survive, we learned instinctively to turn around, leap aside, run away. We didn't just stand there evolving the ability to enjoy a 3-D movie.

OK, let's not quibble with the idea that human beings might have evolved to jump away from oncoming automobiles on the prehistoric savannah. I'm more interested in the two notions that follow from this dubious logic. First, that we ought not consume any form of entertainment that doesn't derive from a selected biological trait; and, second, that standard flat-screen cinema is somehow better suited to our genetic makeup—more natural, I guess—than 3-D.

I wonder if Ebert really believes that the arts should cater to our Darwinian design, or that we're incapable of enjoying anything for which our brain wasn't delicately prewired. But in the event that he does, I'd only point out that such gimmicky and distracting art forms as, say, music, may very well be fiddling with our cortex in ways that have nothing to do with the fight-or-flight demands of a saber-toothed tiger attack.

It's just as silly to presume that viewing a film in 3-D is any less natural—from an evolutionary perspective or otherwise—than watching it flat. For starters, the human eye did not evolve to see elephants stomping across the Serengeti at 24 frames per second. Nor are we biologically attuned to jump cuts, or focus pulls, or the world seen through a rectangular box the sides of which happen to form a ratio of 1.85 to 1. Nor indeed was man designed to gaze at any image while having no control over which objects are in focus and which are blurry. If all those distinctly unnatural aspects of standard, two-dimensional cinema seem unobtrusive, it's only because we've had 125 years to get used to them.

According to Ebert, the 3-D effect brings in an "artificial" third dimension, which doesn't serve to make a movie any more realistic. In fact, he says, it makes an image seem less real, since under normal circumstances "we do not perceive parts of our vision dislodging themselves from the rest and leaping at us." Here he appears to be confusing cheesy, pop-out effects (which are used judiciously in the better—and more recent—films) with the medium as a whole. Yes, some 3-D movies do contain these gimmicks, but others do not.

In any case, it's not clear to me why one depth cue might be deemed artificial and unnecessary, while others are just fine. After all, a regular old 2-D movie carries its own set of visual guidelines for understanding spatial relationships. Objects in the foreground block our vision of what's behind them. Shading and texture tell us about the three-dimensional shape of an object on the screen. Ebert would certainly agree that you don't need to watch the famous sequence from Dial M for Murder in its original 3-D to understand that Anthony Dawson is creeping up behind Grace Kelly, and that he's going to lift a stocking over her head to strangle her. Yet he's apoplectic over the thought of adding one more depth cue into the mix.

With 3-D cinema, we still have occlusion and shading and texture—and we're still missing motion parallax—but now we get the added benefit of binocular disparity. We don't need that extra information to see that Grace Kelly's killer is lurking behind her, but it adds, at the very least, clarity and precision to the scene. Exactly what part of that is "artificial"? As it happens, the 3-D version of Dial M also gives us something more: When Kelly falls across the desk, her hand reaches through the stereo window, as if imploring the audience for help. It doesn't make us jump out of the way like Ebert's Homo habilis. It draws us into the action.

Which brings me to Ebert's latest post, the one described as his final word on "why 3-D doesn't work and never will." To support this claim, he prints a letter from Walter Murch, a decorated film editor and sound designer most notable in this context for sharing Ebert's curmudgeonly disregard for stereo cinema. Like Ebert, Murch complains that 3-D is too dark, and then adds that it's too "small" on the screen. (I think he's referring to the medium's "puppet-theater effect," which tends to make everything and everyone appear shrunken down to the size of dolls.) These problems could be solved, he concedes, but "the biggest problem with 3-D … is the 'convergence/focus issue.' " A stereo film forces the viewers to hold their focus at one plane of depth, even while their eyeballs rotate inwards and outwards to follow the action. "It is like tapping your head and rubbing your stomach at the same time," he goes on. "And 600 million years of evolution has never presented this problem before." (Again with the cavemen …)

This is a reasonable point, and it may represent a real challenge for 3-D filmmakers. I've given my own accounting in Slate: In "The Problem With 3-D," I wondered if the unnatural eye movements provoked by stereo cinema might be the source of the bleary eyes, headache, and nausea that sometimes affect 3-D viewers. This wasn't an original idea, of course—the same concern had been laid out in the Atlantic (to pick just one instance) in 1953, not long after Ebert's dad took him to see Bwana Devil. All these years later, we still don't know whether the "convergence/focus issue" causes 3-D headaches, or if they arise from some other aspect of the experience. Either way, I proposed, the problem of visual discomfort would doom the new batch of digital 3-D films to the same fate as their analog forebears: The bubble will pop.

Thing is, I've changed my mind since I wrote that piece nearly two years ago. Or maybe 3-D movies changed my brain: After watching 10 or 20 of these films since then, I've grown accustomed to the ocular aerobics, and the same format that gave me splitting headaches back in 2009 hardly bothers me now. Meanwhile, certain technical innovations, especially in animated 3-D, have begun to eliminate some of the medium's most egregious visual quirks. And while, like Murch, I'm still distracted by the puppet-theater effect in live-action 3-D, that "problem," too, may diminish as we all get used to it.

If I'm right that it takes multiple viewings to understand and appreciate three-dimensional cinema, you might think Roger Ebert would eventually come around. But even before he'd decided the case was closed, Ebert seems to have sworn off any real engagement with the medium. Armed with his evolutionary theory of film, he's content to sit back and hurl the occasional spear of his own. A recent review of The Green Hornet contained only this note at the very bottom: "Yes, it was in 3-D. The more I see of the process, the more I think of it as a way to charge extra for a dim picture." And while he does commend the effect from time to time—it's "useful" in Tron: Legacy and "quite acceptable" in Megamind—he's rarely willing to acknowledge that 3-D might have anything substantive to offer on its own terms, that maybe it's not only a marketing gimmick (it is that, to be sure), but a new kind of filmmaking that brings along both limitations and opportunities.

Take Toy Story 3: I've gone on record with my admiration for the scene at Daisy's window, where Lotso finds he's been replaced by another toy. There's no sight gag there, no objects hurtling off the screen; instead, the image contorts visual space into a crisscrossing, emotional depth. If the scene were flat, Lotso and Daisy would be right next to each other on the screen; in 3-D, they're spread across a lonely chasm, separated by rain-streaked glass. Is this a fluke, or a sign of what three-dimensional cinema could be? Ebert's not interested. He sums up Pixar's innovative use of stereo with a one-line postscript to his review: "Just don't get me started about the 3-D." Don't get him started; the case is closed. Maybe that's for the best.


Told ya...3D is dying

The fad will fizzle...3D is a doomed novelty

Color-coded terror warnings


Japan is keeping theirs.

"Color-coded terror warnings to be gone by April 27"

by

Eileen Sullivan

January 26th, 2011

The Associated Press

By the end of April, terror threats to the U.S. will no longer be described in shades of green, blue, yellow, orange and red, The Associated Press has learned.

The nation's color-coded terror warning system will be phased out beginning this week, according to government officials familiar with the plan. The officials requested anonymity to speak ahead of an announcement scheduled Thursday by Homeland Security Secretary Janet Napolitano.

The Homeland Security Department and other government agencies have been reviewing the Homeland Security Advisory System's usefulness for more than a year. One of the most notable changes to come: The public will no longer hear automated recordings at U.S. airports stating that the threat level is orange.

The Obama administration will take the next three months to roll out a replacement, which will be called the National Terrorism Advisory System. The new plan calls for notifying specific audiences about specific threats. In some cases, it might be a one-page notice sent to law enforcement officials describing the threat, what law enforcement needs to do about it and what the federal government is doing, one of the officials said.

When agency officials think there is a threat the public should know about, they will issue an announcement and rely on news organizations and social media outlets to get the word out.

Rep. Peter King, R-N.Y., the chairman of the House Homeland Security Committee, said the old threat system served a valuable purpose in the aftermath of the terrorist attacks of Sept. 11, 2001, but that a more targeted system was needed.

"It sounds to me like the changes they are proposing make sense," King said in a statement. "We will have to wait and see how they implement this new, more targeted system. I expect the biggest challenge for DHS will be balancing the need to provide useful and timely information with the need to protect sensitive information."

The five-tiered, color-coded terror warning system, created after the Sept. 11 attacks, was one of the Bush administration's most visible anti-terrorism programs. Criticized as too vague to be useful in communicating the terror threat to the public, it quickly became the butt of late-night talk show jokes.

The government hasn't made changes in the colored alert levels since 2006, despite an uptick in attempted attacks against the U.S. However, the government has changed security protocols since then based on threats. For example, new airport security measures were introduced after an effort to bring down a Detroit-bound jetliner on Christmas Day 2009.

"The old Bush color-coded system taught Americans to be scared, not prepared," said Rep. Bennie Thompson, D-Miss., the top Democrat on the House Homeland Security Committee. "Each and every time the threat level was raised, very rarely did the public know the reason, how to proceed, or for how long to be on alert."

Under that system, green, at the bottom, signals a low danger of attack; blue signals a general risk; yellow, a significant risk; orange, a high risk; and red, at the top, warns of a severe threat. Since the outset, the nation has never been below the third threat level, yellow — an elevated or significant risk of terrorist attack.

The use of colors emerged from a desire to clarify the nonspecific threat information that intelligence officials were receiving after the 2001 attacks.