Monday, May 20, 2013
Victor Appleton II's Tom Swift and his Outpost in Space [aka Sky Wheel]
A space station 22,300 miles above the earth is Tom Swift Jr.'s latest project!
Tom's plans for his gigantic hub-and-spoke outpost of the universe call for twelve laboratories. Solar batteries will be produced in one laboratory, another will be a celestial observatory, and another a radio broadcasting and TV station relaying programs over one third of the earth.
But the project is beset from the start by a fiendish enemy, and also that weird phantom of outer space, Zero Gravity.
Tom comes to grips with the problem of weightlessness by inventing a Zero chamber. Here, in order to master the helpless feeling encountered in space, men are trained to develop a new set of muscular reflexes. The sight of crewmen crawling like flies up and down the walls and across the ceiling of Zero G provides momentary comic relief.
But this is only a prelude to an exciting drama which takes place on a Pacific island, where Tom's rocket fleet is about to blast off. Strange warnings that terrify the natives nearly wreck Tom's plans. How the young scientist overcomes all obstacles and launches his space station makes a gripping book. And each technical detail of this fascinating story has been carefully checked. For those who enjoy the thrill of adventure and the chill of mystery, Outpost In Space is must reading.
Victor Appleton [Wikipedia]
Tom Swift [Wikipedia]
And those wonderful "Tom Swiftie" phrases [Wikipedia]
"I'm an ordained minister," said Tom reverently.
"I've struck oil," Tom said, crudely.
"As soon as the rain stops, we'll break camp," said Tom intently.
"Boy, that sure took the wind out of my sails!" said Tom disgustedly.
"Boy, will I give YOU a haircut!" said Tom barbarously.
"Buy me something to drink?" said Tom dryly.
"Get away from the dynamite," Tom said explosively.
"Give me some more macaroni and cheese, and I'll tell you," said Tom craftily.
"I'd love some Chinese food," said Tom wantonly.
"I collect fairy tales," said Tom grimly.
"I commanded a group of ships for a week," Tom said fleetingly.
"As my sole heir, you get it all," said Tom willfully.
"I forgot what to buy," Tom said listlessly.
"I hate pies with crumb bases," said Tom crustily.
"I joined the Lion's Club," said Tom pridefully.
"I just returned from Japan," Tom said disoriented.
"I know all the wherefores," said Tom wisely.
"I MUST patch this coat," Tom said raggedly.
"I need a home run hitter," Tom said ruthlessly.
"I need a pencil sharpener," said Tom bluntly.
"I only get Newsweek," said Tom timelessly.
"I still haven't struck oil," said Tom boringly.
"I think I'll use a different font," said Tom boldly.
"I want to date around," said Tom unsteadily.
"I was removed from office," said Tom disappointedly.
"I won the daily double," Tom cried hoarsely.
"I don't like hot dogs," Tom said frankly.
"I'll pay off that customs official," said Tom dutifully.
"I'll try and dig it up for you," Tom said gravely.
"I'm back from my lobotomy," said Tom absentmindedly.
"I've made a study of girls," said Tom lassitudinously.
"It's the maid's night off," said Tom helplessly.
"Let's get married," said Tom engagingly.
"Let's visit tombs," said Tom cryptically.
"Look at those newborn kittens," said Tom literally.
"Mush!" Tom said huskily.
"My pencil is dull," said Tom pointlessly.
"Syphilis, sex and fear: How the French disease conquered the world"
Researching the Borgias, Sarah Dunant learnt how syphilis took Europe by storm during the 1490s, and about the far-reaching effects it has had ever since
May 17th, 2013
History doesn't recount who gave Cesare Borgia syphilis, but we do know when and where he got it. In the summer of 1497, he was a 22-year-old cardinal, sent as papal legate by his father, Pope Alexander VI, to crown the king of Naples and broker a royal marriage for his sister, Lucrezia. Naples was a city rich in convents and brothels (a fertile juxtaposition in the male Renaissance imagination), but it was also ripe with disease. Two years earlier, a French invasion force, including mercenary troops back from the New World, had dallied a while to enjoy their victory, and when they left, carried something unexpected and deadly back home with them.
His work accomplished, Cesare took to the streets. Machiavelli, his contemporary and a man with a wit as unflinching as his politics, has left a chilling account of his coupling with a prostitute who, when he lights a lamp afterwards, is revealed as a bald, toothless hag so hideous that he promptly throws up over her. Given Cesare's elevated status, his chosen women no doubt were more enticing, but the sickness they gave him (and suffered themselves) was to prove vicious. First a chancre appeared on his penis, then crippling pains throughout his body and a rash of itching, weeping pustules covering his face and torso. Fortunately for him and for history, his personal doctor, Gaspar Torella, was a medical scholar with a keen interest in this startling new disease and used his patient (under the pseudonym of "Niccolo the young") to record symptoms and attempted cures. Over the next few years, Torella and others charted the unstoppable rise of a disease that had grown men screaming in agony as their flesh was eaten away, in some cases down to the bone.
I still remember the moment, sitting in the British Library, when I came across details of Torella's treatise in a book of essays on syphilis. There is nothing more thrilling in writing historical fiction than when research opens a window on to a whole new landscape, and the story of how this sexual plague swept through Europe during the 1490s was one of the turning points in Blood and Beauty, the novel I was writing on the rise and fall of the Borgia dynasty.
By the time that Cesare felt that first itch, the French disease, as it was then known, had already spread deep into Europe. That same year, Edinburgh town council issued an edict closing brothels, while at the Italian university of Ferrara scholars convened an emergency debate to try to work out what had hit them. By then the method of the contagion was pretty obvious. "Men get it from doing it with women in their vulvas," wrote the Ferrarese court doctor baldly (there is no mention of homosexual transmission, but then "sodomy", as it was known then, was not the stuff of open debate). The theories surrounding the disease were as dramatic as the symptoms: an astrological conjunction of the planets, the boils of Job, a punishment of a wrathful God disgusted by fornication or, as some suggested even then, an entirely new plague brought from the New World by the soldiers of Columbus and fermented in the loins of Neapolitan prostitutes.
Whatever the cause, the horror and the agony were indisputable. "So cruel, so distressing, so appalling that until now nothing more terrible or disgusting has ever been known on this earth," says the German humanist Joseph Grunpeck, who, when he fell victim, bemoaned how "the wound on my priapic gland became so swollen, that both hands could scarcely encircle it." Meanwhile, the artist Albrecht Dürer, later to use images of sufferers in propaganda woodcuts against the Catholic church, wrote "God save me from the French disease. I know of nothing of which I am so afraid … Nearly every man has it and it eats up so many that they die."
It got its name in the mid-16th century from a poem by a Renaissance scholar: its eponymous hero Syphilus, a shepherd, enrages the Sun God and is infected as punishment. Outside poetry, prostitution bears the brunt of the blame, though the real culprit was testosterone. Men infected prostitutes who then passed it on to the next client who gave it back to a new woman in a deadly spiral. Erring husbands gave it to wives who sometimes passed it on to children, though they might also get it from suckling infected wet-nurses.
Amid all this horror there were elements of poetic justice. In a manifestly corrupt church, the give-away "purple flowers" (as the repeated attacks were euphemistically known) that decorated the faces of priests, cardinals, even a pope, were indisputable evidence that celibacy was unenforceable. When Luther, a monk, married a nun, the Catholic church hardened its resistance to similar reform within its own ranks, and syphilis became one of the reasons the Catholic church is still in such trouble today.
Though there has been dispute in recent years over pre-15th-century European bones found with what resemble syphilitic symptoms, medical science is largely agreed that it was indeed a new disease brought back with the men who accompanied Columbus on his 1492 voyage to the Americas. In terms of germ warfare, it was a fitting weapon to match the devastation that measles and smallpox inflicted travelling the other way. It was not until 1905 that the cause of all this suffering was finally identified under the microscope – Treponema pallidum, a spirochete bacterium that enters the bloodstream and, if left untreated, attacks the nervous system, the heart, internal organs and the brain; and it was not until the 1940s and the arrival of penicillin that there was an effective cure.
Much of the extraordinary detail we now have about syphilis is a result of the Aids crisis. Just when we thought antibiotics, the pill and more liberal attitudes had taken the danger and shame out of sexual behaviour, the arrival out of nowhere of an incurable, fatal, highly contagious sexual disease challenged medical science, triggered a public-health crisis and re-awoke a moral panic.
Not surprisingly, it also made the history of syphilis extremely relevant again. The timing was powerful in another way too, as by the 1980s history itself was refocusing; from the long march of the political and the powerful, to the more intimate cultural stories of everyman/woman. The growth of areas such as history of medicine and madness through the work of historians such as Roy Porter and Michel Foucault was making the body a rich topic for academics. Suddenly, the study of syphilis became, well, there is no other word for it, sexy.
Historians mining the archives of prisons, hospitals and asylums now estimate that a fifth of the population might have been infected at any one time. London hospitals during the 18th century treated barely a fraction of the poor, and on discharge sufferers were publicly whipped to ram home the moral lesson.
Those who could buy care also bought silence – the confidentiality of the modern doctor/patient relationship has its roots in the treatment of syphilis. Not that it always helped. The old adage "a night with Venus; a lifetime with Mercury" reveals all manner of horrors, from men suffocating in overheated steam baths to quacks who peddled chocolate drinks laced with mercury so that infected husbands could treat their wives and families without them knowing. Even court fashion is part of the story, with pancake makeup and beauty spots as much a response to the scars of recurrent syphilis as to those of smallpox.
And then there are the artists; poets, painters, philosophers, composers. Some wore their infection almost as a badge of pride: the Earl of Rochester, Casanova, Flaubert in his letters. In Voltaire's Candide, Pangloss can trace his chain of infection right back to a Jesuit novice who caught it from a woman who caught it from a sailor in the new world. Others were more secretive. Shame is a powerful censor in history, and in its later stages syphilis, known as the "great imitator", mimics so many other diseases that it's easy to hide the truth. Detective work by writers such as Deborah Hayden (The Pox: Genius, Madness, and the Mysteries of Syphilis) counts Schubert, Schumann, Baudelaire, Maupassant, Flaubert, Van Gogh, Nietzsche, Wilde and Joyce among the sufferers, with contentious evidence around Beethoven and Hitler. Her larger question – how might the disease itself have affected their creative process – is a tricky one.
Van Gogh paints skulls and Schubert's sublime last works are clearly suffused with the awareness of death. But in 1888, when Nietzsche, tumbling into insanity, wrote works such as Ecce Homo, is his intellectual grandiosity genius or possibly the disease talking? There is a further layer of complexity to this. By the time Nietzsche lost his wits, tertiary syphilis had undergone a transmutation, infecting the brain and causing paralysis alongside mental disintegration. But many of its sufferers didn't know that then. Guy de Maupassant, who started triumphant ("I can screw street whores now and say to them 'I've got the pox.' They are afraid and I just laugh"), died 15 years later in an asylum howling like a dog and planting twigs as baby Maupassants in the garden.
Late 19th-century French culture was a particularly rich stew of sexual desire and fear. Upmarket Paris restaurants had private rooms where the clientele could enjoy more than food, and in opera foyers patrons could view and "reserve" young girls for later. At the same time, the authorities were rounding up, testing and treating prostitutes, often too late either for the women themselves or for the clients' wives. As the fear grew, so did the interest in disturbed women. Charcot's clinic exhibited examples of hysteria, prompting the question now as to how far that diagnosis might have been covering up the workings of syphilis. Freud noted the impact of the disease inside the family when analysing his early female patients.
"It's just as I thought. I've got it for life," says the novelist Alphonse Daudet after a meeting with Charcot in the 1880s. In his book In the Land of Pain, translated and edited by Julian Barnes in 2002, the writer's eye is unflinching as he faces "the torment of the Cross: violent wrenching of the hands, feet, knees, nerves stretched and pulled to breaking point," dimmed only by the blunt relief of increasing amounts of morphine: "Each injection [helps] for three or four hours. Then come 'the wasps' stinging, stabbing here, there, everywhere followed by Pain, that cruel guest … My anguish is great and I weep as I write."
Of course, we have not seen the end of syphilis – worldwide millions of people still contract it, and there are reports, especially within the sex industry, that it has been on the increase in recent years. But the vast majority will be cured by antibiotics before it takes hold. They will never reach the point, as Cesare Borgia did in the early 16th century, of having to wear a mask to cover the ruin of what everyone agreed was once a most handsome face. What he lost in vanity he gained in sinister mystery. How far his behaviour, oscillating between lethargy and manic energy, was also the impact of the disease we will never know. He survived it long enough to be cut to pieces escaping from a Spanish prison. Meanwhile, in the city of Ferrara, his beloved sister Lucrezia, then married to a duke famed for extramarital philandering, suffered repeated miscarriages – a powerful sign of infection in female sufferers. For those of us wedded to turning history into fiction, the story of syphilis proves the cliche: truth is stranger than anyone could make up.
Thanks to POSP stringer Tim.
"What’s the point of political philosophy?"
May 17th, 2013
It is a near-truism that philosophy operates at a remove from the “real world.” Many philosophers suppose that the answers to questions in logic, epistemology and metaphysics are independent of particular empirical facts about how human society happens to be set up. But what about ethics and political philosophy? How far should philosophers concerned with these areas take into account the messy reality of everyday life?
Not far at all, says one venerable tradition that dates back at least to Kant in the 18th century, and probably as far as Plato. From this perspective, the job of ethics and political philosophy is to work out how things ought to be. This need not be closely related to how things actually are. For the philosopher trying to imagine the ideal society or specify the nature of virtue, engaging in detail with the world in its current state (or in its historical forms) may be unnecessary or even unhelpful.
This traditional picture, however, has always had its detractors. In recent years the attack has been led by a group identifying themselves as “political realists,” counting amongst their number philosophers such as Raymond Geuss and the late Bernard Williams. According to the realists, the traditional picture risks making political philosophy both irrelevant and falsely universalistic, mistakenly supposing that the same abstract principles are applicable to societies of radically different kinds. Realists have singled out many of the most prominent political philosophers of the 20th century—John Rawls, Robert Nozick, Ronald Dworkin and GA Cohen—for particular scorn.
The realist critique of these philosophers—let’s call them, by contrast, the “idealists”— encompasses a number of distinct charges, not all of which sit well together. One criticism is that the idealists’ abstract theories of justice are insufficiently engaged with real politics. Another related accusation is that their demands are unrealistic, standing no chance of being implemented. Another common charge, although a very different line of attack, is that idealists—Rawls in particular—are apologists for the political status quo, cooking up a convenient justification for the US’s particular brand of liberal democracy. Finally, the realists sometimes seem sceptical about the whole project of formulating theories of justice, suspecting that such theories are merely ideological devices that obscure power relations, or that there is in fact no universal theory of justice independent of particular societies and their convictions. They argue that trying to design a single political theory to apply to, say, Britain, China and Morocco—not to mention the political cultures of the past—is hopelessly naive.
Although there is something appealing about these lines of criticism, many existing articulations suffer from a mixture of blustering polemic (Geuss allegedly sent a number of colleagues a postcard juxtaposing the pictures of Kant, Rawls, George W Bush and an Abu Ghraib prisoner) and faux-profound obscurantism (see, for instance, Geuss and Williams’s proclamations about the “uniqueness of the political”). They also frequently rely on rather uncharitable readings of the idealists and on a desire to engineer a kind of showdown between rival approaches in political philosophy, a tactic which often obscures points of agreement between traditions.
In the midst of this confused debate, the arrival of David Miller’s richly nuanced, philosophically acute, and finely written book, Justice for Earthlings (Cambridge University Press, £18.99), is very welcome. Miller avoids the label “realist,” preferring a more moderate position which he calls “contextualism.” He argues that, whilst there is some capacity for political philosophy to critique existing political ideas, there is a limit to how revisionary it can be—a limit to the extent of its aspirations to reform both our sense of what justice is and our concrete political system. Rather, its fundamental business is to systematise and make consistent our existing convictions. Moreover, Miller agrees with the realists that in a society with radically different convictions, the appropriate fundamental principles of justice would be different. Most provocatively, as Miller puts it, “justice [is] a human invention that accordingly is shaped by the circumstances of human life.” There is no true theory of justice independent of us.
At first glance this might seem like a much more attractive approach to political philosophy than the idealist one. After all, societies have differed widely in their ideas of justice, and it can seem like cultural imperialism to say that it is only the modern western conception that has got it right. Furthermore, Miller is surely correct that the justifiability of particular political policies sometimes depends on circumstances and on the convictions of citizens. An example may help here. Sikh men are unable to wear regular motorcycle helmets because of their turbans. Passing a law making it compulsory to wear motorcycle helmets would force such individuals to violate their own religious convictions, and, particularly in majority-Sikh societies, would go against the overwhelming democratic will of the people. Thus it would not be legitimate for a government to pass such a law. Yet in a society with no Sikh citizens—where passing the law would not violate religious convictions or go against the democratic will—it might be completely legitimate. So, the circumstances, and the convictions of individuals, make a difference to what is justifiable.
But it is not only contextualists like Miller who can offer this kind of sensitive approach to circumstances. So, too, can many idealists—a fact made clear by a fascinating exchange over the years between Miller and the late, great socialist philosopher GA Cohen. Questions like whether motorcycle-helmet-wearing should be mandated or not, Cohen argued, are not fundamental questions of justice. Rather, there are more fundamental principles—do not force people to violate their core religious identities; do not legislate against the overwhelming democratic will of the people—which themselves explain why we get one verdict in one set of circumstances, and a different verdict in another. And it is only the most fundamental principles, on Cohen’s view, which are universal. So he can acknowledge much of the context-sensitivity that Miller helpfully points us towards.
What divides Cohen and Miller is Miller’s claim that there need not be deeper universal principles of the sort that Cohen envisages underlying particular political prescriptions. So what’s the attraction of Cohen’s idealism over Miller’s contextualism? Well, according to Cohen, we can only explain why particular circumstances must be taken into account by providing more general principles that specify how they do so. Moreover, one requires such fundamental principles to adjudicate more borderline cases. To return to our example: what if the turban-wearing group is a sizeable minority, for example, but nowhere near a democratic majority—what policy is appropriate then?
Miller’s contextualism, and relativist views more generally, can seem appealing due to a humane desire to be tolerant and sensitive to the views and practices of others. We want to make room for people to pursue the good life as they see it, informed by their own historical, cultural and religious perspectives—and not to be told by the state what is good for them and what to value. But, as the old point goes, there is only reason to be tolerant if tolerance has a value that is not itself relative. It’s partly for this reason that philosophers have been so hesitant to move beyond sensitivity to circumstances, and embrace Miller’s bolder claim that justice is a human invention.
One might, alternatively, be drawn to Miller’s work if one is worried that endorsing universal, rather than contextual, principles of justice will lead to enforcing a particular moral conception on citizens against their will. And it is an important insight of political philosophy that not every moral truth is an appropriate basis for the state to legislate on. For instance, it might be that, as private individuals, we are morally obligated to give a lot more money to charities, especially to those in the developing world, than most of us do. But it doesn’t follow that the state would be justified in forcing us to do so. Past a certain point, individual citizens in a liberal society must be free to come up morally short—and it is citizens, not the state, that are accountable for such failings.
But, once again, this does not show that political philosophy should not try to formulate universal principles. It’s just that these principles govern state action, not that of private individuals. The claim that states should not enforce particular moral conceptions on people is itself a moral claim: a claim about what the state has the right to legitimately do. And we need principles to fix the limits of the sphere in which the state is justified in compelling private citizens.
If, like Miller, you think that there need not be such deeper universal principles underlying particular prescriptions, what does make such prescriptions appropriate in one context but not another? For example, Miller suggests that even basic democratic principles are not applicable in some societies. Since formulating political principles is about articulating our convictions, says Miller, democratic principles may be inappropriate in a society where people do not support democracy. And this is not just a hypothetical scenario: recent polling data suggests that democracy does not always find popular support—even in countries such as Libya which have been portrayed as swept up in recent pro-democracy movements.
One way to justify the claim that democratic principles do not apply in such countries would be to say that basic political structures ought to be arranged in the way the citizenry want them to be. Whether convincing or not—and it looks rather self-defeating—Miller would be unlikely to pursue this line of argument. After all, it attempts to explain why democracy is appropriate in some contexts but not others in terms of a deeper principle: political structures should reflect the wishes of the citizenry. Instead, it seems that Miller believes that there is simply nothing more to a principle being “appropriate” than it reflecting people’s convictions. This is not just to say that our convictions are all we have to go on in working out what is just. Rather, it is to say that, since justice is a human invention, there is no fact of the matter about what is just: only what we think is just. There is no way to fundamentally justify such claims, and all political philosophy can aspire to do is to articulate them and make them consistent.
This puts us in a somewhat precarious position. When we have convictions about what is just, part of what makes them convictions is the feeling that they are not just arbitrary. When you believe, for instance, that racial discrimination is unjust, you don’t just believe that you believe that racial discrimination is unjust; you believe that racial discrimination is unjust. Of course, you should acknowledge that not all others share your view and that they are entitled to their views. But it’s part of believing that racial discrimination is unjust that you see those who deny this as in at least some way mistaken. It is hard—perhaps even incoherent—to cling on to one’s convictions if one has to simultaneously view them as no more correct than anyone else’s.
Most importantly, it is unclear what the point of giving a systematic articulation of our convictions would be if they were just arbitrary, and did not get at some deeper truth. If the convictions are arbitrary, why should I care what the convictions of my society as a whole are, as opposed to my own convictions, or those of my family, or gym club, or racial group? A picture of political philosophy as giving voice to arbitrary fictions which we just happen to have inherited makes it look significantly more pointless than a picture of political philosophy as making “utopian” demands that do not look like being fulfilled any time soon.
Admittedly, Miller would probably not accept this somewhat nihilistic picture. He does think that political philosophy should be more than a description of how we currently think. The question, however, is whether this is reconcilable with his view that justice is an invention, and that even the most fundamental norms should not be applied where they are not accepted. As many radical political philosophers have stressed, one key task of political philosophy is to alert us to the ways in which our existing political beliefs are distorted by bias and ideology, and thereby to seek to revise them. But the notion of our beliefs about justice being distorted by bias does not even make sense if there is no justice independent of our beliefs, as Miller’s claim that justice is a human invention implies. Things can only be distorted if there is a truth to distort.
That said, one can acknowledge Miller’s insights—retaining his objection to “utopian” political philosophy which makes demands that stand no chance of being achieved—without signing up to his entire programme. It may be right that political philosophers will marginalise themselves by seeming wildly out of touch with reality. Yet it may also be true that it is sensible to “start the bidding high” when making demands about changes to our political systems. Striking the balance here is a difficult task, and what strategy works best is an empirical question which we should not try to settle from the armchair. Nevertheless, these questions are not fundamentally about what justice is, but rather about how best to realise it.
Miller’s book raises one final, more pessimistic, question. Does the exact methodology of political philosophy really affect its chance of influencing political practice? It is true that a more “realistic” set of demands will be more relevant in the sense that they could be implemented if anyone were listening, but the reality is that, with the possible exception of a few very high-profile figures such as Rawls, most political philosophers do not influence public policy to a great degree. If this pessimism is apt, then it is unclear what the point of trying to make political philosophy more “relevant” is. A subtle methodological shift in the way political philosophy is practiced is unlikely to give it significantly more influence. In light of this, we should think twice about compromising its critical function, insofar as it has some influence through universities and think tanks, rather than put it in danger of becoming an apology for the failings of our society.
Justice for Earthlings: Essays in Political Philosophy
Thanks to POSP stringer Tim.
Sunday, May 19, 2013
Ellen von Unwerth
It is a tough row to hoe. More than just knowing people in the industry, one must have talent.
Ellen von Unwerth [Wikipedia]
"How Bing Crosby and the Nazis Helped to Create Silicon Valley"
May 13th, 2013
The New Yorker
The nineteen-forties Bing Crosby hit “White Christmas” is a key part of the national emotional regression that occurs every Christmas. Between Christmases, Crosby is most often remembered as a sometimes-brutal father, thanks to a memoir by his son Gary. Less remarked upon is Crosby’s role as a popularizer of jazz, first with Paul Whiteman’s orchestra, and later as a collaborator with, disciple to, and champion of Louis Armstrong. Hardly remarked upon at all is that Crosby, by accident, is a grandfather to the computer hard drive and an angel investor in one of the firms that created Silicon Valley.
If today’s youth make up the first digital generation, Crosby’s was the first recorded-music generation. Born in 1903, Crosby grew up in Spokane, Washington, where he spent his latter adolescence haunting record stores and learning the drums, and his twenties on the road as a drummer and singer. He landed in Paul Whiteman’s legendary dance band, touring the country. Vaudeville was fading, as was the belting projection of singers like Al Jolson; jazz, talkies, and the radio were ascendant, with Crosby in the wave.
As Crosby left Spokane, writes Gary Giddins in “Bing Crosby: A Pocketful of Dreams,” acoustical reproduction (yelling into a megaphone so that the sound might be recorded directly onto wax) was giving way to electrical reproduction of music. “That innovation, which dominated the industry for more than two decades (until the introduction of tape), would help bring Bing’s strengths into the spotlight, leading directly to the advancement of his true instrument, the microphone.”
Microphones changed everything. Rather than spraying the balcony with emotion (or using a simple megaphone for amplification), the act of performance became more intimate, the singer more vulnerable. In time, the tinnier carbon microphones (as in the telephone) gave way to condenser microphones. Far more vocal subtlety could be transmitted. The dynamics of entertainment allowed for quiet. A different sort of voice found its place on stage and in recordings: the crooner.
From “The Coming of the Crooners,” by Ian Whitcomb:
The press had a field day disseminating the attacks on the “crooning boom” by moral authorities. In January 1932 they quoted Cardinal O’Connell of Boston: “Crooning is a degenerate form of singing…. No true American would practice this base art. I cannot turn the dial without getting these whiners and bleaters defiling the air and crying vapid words to impossible tunes.” The New York Singing Teachers Association chimed in, “Crooning corrupts the minds and ideals of the younger generation.” Lee DeForest, one of radio’s inventors, regretted that his hopes for the medium as a dispenser of “golden argosies of tone” had become “a continual drivel of sickening crooning by ‘sax’ players interlaced with blatant sales talk.”
Rudy Vallee was the first famous crooner, and the foremost, but Crosby held his own. In the nineteen-thirties he recorded “Learn to Croon.”
All hopes for the abolition of crooning were dashed by the rise of radio, a crooner’s medium. Crosby became a radio megastar. The other greats—like Bob Hope, Fred Allen, and Jack Benny—each came up in vaudeville, and their pacing reveals their early stage training; they project. Crosby did stints in the vaudeville-circuit theatres, too, but the bemused, pipe-smoking, golfing fellow who drifted in and out of song was born of the possibilities of the microphone.
Fast-forward into the mid-nineteen-forties. The Second World War had just ended. Americans were picking over the technological remains of German industry. One of the things they discovered was magnetic tape; the Nazis had been using tape recording to broadcast propaganda across time zones. It was a remarkable invention. Previous sound-recording technologies had used wax cylinders or discs, or delicate wires. But magnetic tape was remarkably fungible: it could be recorded over, cut and spliced together. Plus it sounded better.
Radio shows, however, were supposed to be live. Radio inherited its forms from vaudeville, from variety shows, and it was assumed that the artifice of pre-recording would diminish the audience’s connection, at great risk to the sponsors. Crosby—a master of artifice—didn’t buy that, according to “Bing Crosby: Crooner of the Century,” by Richard Grudens. In 1946 he used his industry power—by then he was on top, one of the world’s richest, most famous and intensely beloved celebrities—to step away from live broadcast by choosing a sponsor and network that would let him use large wax discs. “Philco Radio Time” débuted in 1946 on ABC, at thirty thousand dollars a week. Bob Hope was his first guest.
Meanwhile, engineers interested in tape, having learned what they could from what the Nazis left behind, made their way to Crosby and showed him what the new magnetic technology could do. His interest was more than piqued; he handed fifty thousand dollars to the men from the Ampex corporation, which at that time was just a half-dozen people. The machines they delivered went into use in 1947, and a new Crosby show, edited by tape splicing, was broadcast—the first radio show to use the new technology. Suddenly audio—recorded media—was flexible. It could be cut and pasted, rearranged, and edited.
The Ampex sign still stands over Redwood City; it’s a Silicon Valley landmark. And Ampex still exists as a smaller company focussed on various kinds of recording. But the company is not what it was; for some time, it was a major manufacturer of equipment in America, a key player in early Valley history: as tape recording caught on, along came computers with stored programs. Magnetic tape was an improvement, in many regards, over punched cards or paper tape; it could more readily store data and programs and play them back. From the roots put down through Ampex came a revolution in data storage.
Tapes were still awkward beasts, however—a tape is essentially a long piece of string. If a piece of data is at the end of the string, you have to spin the tape until you get to the end. As anyone who grew up on old machines that used cassettes to store programs knows, with tape the basics of computing—storage, retrieval—take what, to modern sensibilities, feels like an eternity.
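The sequential-access penalty described above can be made concrete with a toy model (an illustrative sketch, not anything from the article; both functions and their cost numbers are hypothetical): reaching record N on a tape means winding past every record before it, while a disk can seek straight to an offset.

```python
# Toy comparison of sequential (tape) vs. random (disk) access.
# The cost units are illustrative "records passed", not real drive timings.

def tape_reads_to_reach(record_index):
    """A tape must wind past every earlier record, so cost grows linearly."""
    return record_index + 1  # every record up to and including the target

def disk_reads_to_reach(record_index):
    """A disk head seeks directly to the platter offset, so cost is constant."""
    return 1  # one seek, regardless of where the record sits

# Reaching the last of 10,000 records:
print(tape_reads_to_reach(9_999))  # 10000
print(disk_reads_to_reach(9_999))  # 1
```

The linear-versus-constant gap is exactly why the RAMAC's random-access platters, described next, were such an improvement over tape for data storage.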
In the nineteen-fifties I.B.M. developed a research project to create the RAMAC, for Random Access Method of Accounting and Control. Roughly the size of a washing machine (and that was just the disk), RAMAC was a set of platters that held about five megabytes of data—about as much data as is in a single longish MP3 today. Behold the glory of this majestic device:
This was, of course, the first hard drive, and in “Magnetic Disk Storage: A Personal Memoir,” a man named Albert S. Hoagland, who worked on the RAMAC, cites the Crosby connection—how the singer’s unusual professional needs led to tape recording. There is a direct link in the Silicon Valley understanding between Bing Crosby’s crooning and the rise of the hard drive, which was designed as an improvement over magnetic tape. Or, to put it into an equation: microphones + crooning + Nazis + radio + fifty thousand dollars = Silicon Valley.
RAMAC was victorious, for although you’ll still find tape for data storage, the world belongs to the hard drive. But only for now. S.S.D.s—solid state disks, banks of memory—are taking over. The link to the Nazis and magnetic tape is slowly breaking apart.
Crosby’s career was built on technology, and he used technology to become a master of artifice: to sing as if he were sitting next to you, even if he were in California and you were in New York. He was an investor with a clear motive—a desire to stop recording live—but the ancillary benefits of tape, which could be rearranged with a razor blade, were useful to him as well. It was a pattern of his life: he also invested in fast-freezing technology, and hence became chairman of the board and chief promoter of Minute Maid. When the company went public, he rang the bell at the Stock Exchange. “White Christmas” and orange juice and bad parenting are the memories he left, along with countless songs.
His artifice was a means to an end. Perhaps this is apocryphal, but once while editing his show on tape he asked for a joke to get a different reaction—for a past laugh to be spliced in. Thus, in addition to setting in motion the technologies that brought about the information revolution, he also indirectly created the laugh track.
Saturday, May 18, 2013
This rundown of the elements in numerical order is set to Jacques Offenbach's Infernal Galop, but was otherwise written, produced, and performed by Mitchell Moffit. Here are the lyrics in case you missed anything.
There's Hydrogen and Helium
Then Lithium, Beryllium
Boron, Carbon everywhere
Nitrogen all through the air

With Oxygen so you can breathe
And Fluorine for your pretty teeth
Neon to light up the signs
Sodium for salty times

Magnesium, Aluminium, Silicon
Phosphorus, then Sulfur, Chlorine and Argon
Potassium, and Calcium so you'll grow strong
Scandium, Titanium, Vanadium and Chromium and Manganese

This is the Periodic Table
Noble gas is stable
Halogens and Alkali react aggressively
Each period will see new outer shells
While electrons are added moving to the right

Iron is the 26th
Then Cobalt, Nickel coins you get
Copper, Zinc and Gallium
Germanium and Arsenic

Selenium and Bromine film
While Krypton helps light up your room
Rubidium and Strontium then
Yttrium, Zirconium

Niobium, Molybdenum, Technetium
Ruthenium, Rhodium, Palladium
Silver-ware then Cadmium and Indium
Tin-cans, Antimony then Tellurium and Iodine and Xenon and then Caesium and...

Barium is 56 and this is where the table splits
Where Lanthanides have just begun
Lanthanum, Cerium and Praseodymium

Neodymium's next too
Promethium, then 62's
Samarium, Europium, Gadolinium and Terbium
Dysprosium, Holmium, Erbium, Thulium
Ytterbium, Lutetium

Hafnium, Tantalum, Tungsten then we're on to
Rhenium, Osmium and Iridium
Platinum, Gold to make you rich till you grow old
Mercury to tell you when it's really cold

Thallium and Lead then Bismuth for your tummy
Polonium, Astatine would not be yummy
Radon, Francium will last a little time
Radium then Actinides at 89

Actinium, Thorium, Protactinium
Uranium, Neptunium, Plutonium
Americium, Curium, Berkelium
Californium, Einsteinium, Fermium
Mendelevium, Nobelium, Lawrencium
Rutherfordium, Dubnium, Seaborgium
Bohrium, Hassium then Meitnerium
Darmstadtium, Roentgenium, Copernicium

Ununtrium, Flerovium
Ununpentium, Livermorium
Ununseptium, Ununoctium
And then we're done!!
Tom Lehrer's "The Elements"