Tuesday, July 30, 2013

Deceased--George Stocking Jr.

George Stocking Jr.
December 8th, 1928 to July 13th, 2013

"George Stocking Jr., ‘Anthropology’s Anthropologist,’ Dies at 84"

by

Paul Vitello

July 29th, 2013

The New York Times

George W. Stocking Jr., a historian of science who chronicled the norms, customs and tribal beliefs of modern anthropologists, documenting a history of racial bias and ethnocentrism as well as great insights, died on July 13 in Chicago. He was 84.

His death was confirmed by a daughter, Susan Stocking Baltrushes, who said his health had been declining for several years.

Professor Stocking, who taught history and anthropology at the University of Chicago, was best known for his studies of anthropology’s pioneers, most notably Edward Burnett Tylor, the self-taught 19th-century British theorist who is often called the father of the field, and Franz Boas, the German-American émigré who pioneered its practice in the United States.

Colleagues said his work helped produce a culture shift in anthropology during the 1960s and ’70s that heralded a growing respect for cultural diversity throughout society. “He was the in-house social critic,” said Raymond D. Fogelson, a professor of anthropology at the University of Chicago.

Scholars in the field called Professor Stocking “anthropology’s anthropologist,” he added.

Professor Stocking’s first book, a 1968 collection of essays, “Race, Culture and Evolution,” was considered a landmark. Margaret Mead, writing in the journal American Anthropologist, called it “beautifully and painstakingly” illuminating.

Professor Stocking described his objective as redefining the accepted view of anthropology’s “origin story,” one centered on Tylor as a foundational theorist. Tylor is considered to have been the first to define the word “culture” in the modern sense: as a set of knowledge, customs and values that all people acquire from their native environments, whether England or Polynesia.

“Race, Culture and Evolution” profiled Tylor, Boas and others who had professionalized a field that had long been dominated by gentleman amateurs and missionaries. The book traced the ideological fault lines among 19th-century practitioners of anthropology and asserted that Tylor’s standing did not bear a close reading of his work.

Though Tylor acknowledged the existence of many human cultures, and refuted early anthropologists who called dark-skinned people savages of another species, he recognized only one cultural ideal, believing that “less civilized” people should accept guidance from more civilized ones, Professor Stocking wrote.

He quoted Tylor from his canonical 1871 text, “Primitive Culture”: “The civilized man is not only wiser and more capable than the savage, but also better and happier.”

Professor Stocking and others linked the widespread scientific acceptance of such ideas to the brutalities of 19th-century colonialism, the annihilation of American Indians and other purported mass “civilizing” campaigns.

For Professor Stocking, a genuinely science-based approach to the study of human culture was embodied by Boas, a trained physicist who studied American Indian tribes in the Pacific Northwest and later debunked theories linking race with culture and intelligence. He challenged race’s standing as a legitimate scientific concept, calling it “less a biological fact than a social myth.”

Professor Stocking’s essays on Boas, and a book he edited, “The Shaping of American Anthropology, 1883-1911: A Franz Boas Reader” (1974), were instrumental in reviving interest in Boas among a generation that had not been born when Boas died in 1942.

“Boas had an agenda: to rid anthropology of racism,” said Paul A. Erickson, a professor of anthropology at St. Mary’s University in Halifax, Nova Scotia, and an author, with Liam D. Murphy, of “A History of Anthropological Theory,” a standard text. Professor Stocking’s work, he said, had revived interest in that agenda as well.

George Ward Stocking Jr. was born on Dec. 8, 1928, in Berlin to George and Dorothe Reichard Stocking. His father, a professor of economics at the University of Texas at Austin, was conducting research in Germany at the time. George Jr. was raised mainly in Austin. He attended Harvard, where he became active in radical political movements and joined the Communist Party. After graduating in 1949, he was a labor organizer and factory worker.

He left the party in 1956 and received his Ph.D. in American studies from the University of Pennsylvania. He later acknowledged that with his ties to the Communist Party during those cold war years, he would never have been hired as an instructor at the University of California, Berkeley, if not for his father’s prominence as an economist and adviser to presidents.

Besides his daughter Ms. Baltrushes, Professor Stocking is survived by his wife, Carol Ann Bowman; three other daughters, Rebecca and Rachel Stocking and Melissa Stocking Robinson; a son, Thomas; two sisters, Sybil Winterburn and Cynthia Peck; 10 grandchildren; and three great-grandchildren.

In a semi-autobiography, “Glimpses Into My Own Black Box,” published in 2010, Professor Stocking turned on himself the same unblinking gaze he had trained on the history of anthropology. The book, an account of his youth and family ancestry, amounts to an ethno-historical description of upper-middle-class life at the end of the 19th century and the first half of the 20th.

The seven years he spent as a Communist, Professor Stocking wrote, taught him the habit of “self-criticism,” which made him “suspicious of master narratives” and open to studying, in effect, the narratives of people who study others’ narratives.

“I have always felt myself ultimately an outsider to the anthropological tribe,” he wrote.


Race, Culture, and Evolution: Essays in the History of Anthropology

ISBN-10: 0226774945
ISBN-13: 978-0226774947
 

After Tylor: British Social Anthropology, 1888-1951

ISBN-10: 0299145840
ISBN-13: 978-0299145842

George Stocking Jr. [Wikipedia]

Ideomotor phenomenon and the good ole ouija board


"How the ouija board really moves"

by

Tom Stafford

July 30th, 2013

BBC NEWS

The mystery isn’t a connection to the spirit world, but why we can make movements and yet not realise that we're making them.

Ouija board cups and dowsing wands – just two examples of mystical items that seem to move of their own accord, when they are really being moved by the people holding them. The only mystery is not one of a connection to the spirit world, but of why we can make movements and yet not realise that we're making them.

The phenomenon is called the ideomotor effect and you can witness it yourself if you hang a small weight like a button or a ring from a string (ideally more than a foot long). Hold the end of the string with your arm out in front of you, so the weight hangs down freely. Try to hold your arm completely still. The weight will start to swing clockwise or anticlockwise in small circles. Do not start this motion yourself. Instead, just ask yourself a question – any question – and say that the weight will swing clockwise to answer "Yes" and anticlockwise for "No". Hold this thought in mind, and soon, even though you are trying not to make any motion, the weight will start to swing in answer to your question.

Magic? Only the ordinary everyday magic of consciousness. There's no supernatural force at work, just tiny movements you are making without realising. The string allows these movements to be exaggerated, the inertia of the weight allows them to be conserved and built on until they form a regular swinging motion. The effect is known as Chevreul's Pendulum, after the 19th Century French scientist who investigated it.

What is happening with Chevreul's Pendulum is that you are witnessing a movement (of the weight) without "owning" that movement as being caused by you. The same basic phenomenon underlies dowsing – where small movements of the hands cause the dowsing wand to swing wildly – or the Ouija board, where multiple people hold a cup and it seems to move of its own accord to answer questions by spelling out letters.

This effect also underlies the sad case of "facilitated communication", a fad whereby carers believed they could help severely disabled children communicate by guiding their fingers around a keyboard. Research showed that the carers – completely innocently – were typing the messages themselves, rather than interpreting movements from their charges.

The interesting thing about the phenomenon is what it says about the mind. That we can make movements that we don't realise we're making suggests that we shouldn't be so confident in our other judgements about what movements we think are ours. Sure enough, in the right circumstances, you can get people to believe they have caused things that actually come from a completely independent source (something which shouldn't surprise anyone who has reflected on the madness of people who claim that it only started raining because they forgot an umbrella).

You can read what this means for the nature of our minds in The Illusion of Conscious Will by psychologist Daniel Wegner, who sadly died last month. Wegner argued that our normal sense of owning an action is an illusion, or – if you will – a construction. The mental processes which directly control our movements are not connected to the same processes which figure out what caused what, he claimed.

The situation is not that of a mental command-and-control structure like a disciplined army, in which a general issues orders to the troops, they carry out the order and the general gets back a report saying "Sir! We did it. The right hand is moving into action!". The situation is more akin to an organised collective, claims Wegner: the general can issue orders, and watch what happens, but he's never sure exactly what caused what. Instead, just like with other people, our consciousness (the general in this metaphor) has to apply some principles to figure out when a movement is one we've made.

One of these principles is that cause has to be consistent with effect. If you think "I'll move my hand" and your hand moves, you're likely to automatically get the feeling that the movement was one you made. The principle is broken when the thought is different from the effect, such as with Chevreul's Pendulum. If you think "I'm not moving my hand", you are less inclined to connect any small movements you make with such large visual effects.

This may explain why kids can shout "It wasn't me!" after breaking something in plain sight. They thought to themselves "I'll just give this a little push", and when it falls off the table and breaks it doesn't feel like something they did.

Monday, July 29, 2013

What astronauts found on the Moon and never related


"Hey Diddle Diddle" ["Hi Diddle Diddle", "The Cat and the Fiddle", or "The Cow Jumped Over the Moon"]


Hey diddle diddle,
The Cat and the fiddle,
The Cow jumped over the moon.
The little Dog laughed,
To see such sport,
And the Dish ran away with the Spoon.

Wikipedia...

There are numerous theories about the origin of the rhyme. These include: James Orchard Halliwell's suggestion that it was a corruption of ancient Greek, probably advanced as a result of a deliberate hoax; that it was connected with Hathor worship; that it refers to various constellations (Taurus, Canis minor, the Big Dipper etc.); that it describes the Flight from Egypt; that it depicts Elizabeth, Lady Katherine Grey, and her relationships with the earls of Hertford and Leicester; that it deals with anti-clerical feeling over injunctions by Catholic priests for harder work; that it describes Katherine of Aragon (Katherine la Fidèle); Catherine, the wife of Peter the Great; Canton de Fidèle, a supposed governor of Calais; and the game of cat (trap-ball). This profusion of unsupported explanations was satirised by J.R.R. Tolkien in his fictional explanations of 'The Man in the Moon Stayed Up Too Late'. Most scholarly commentators consider these explanations unproven and hold that the verse is probably meant to be simply nonsense. The melody commonly associated with the rhyme was first recorded by the composer and nursery rhyme collector James William Elliott in his National Nursery Rhymes and Nursery Songs (1870).

Through education and self-knowledge...abolish war, poverty and disease...so said Bertrand Russell in 1960


Bertrand Russell [Wikipedia]

Bertrand Russell [Stanford Encyclopedia of Philosophy]


Sunday, July 28, 2013

Deceased--James P. Gordon

James P. Gordon
March 20th, 1928 to June 21st, 2013

Charles H. Townes [left], winner of the 1964 Nobel Prize in physics, and James P. Gordon in 1955 with the first maser

"James Gordon Dies at 85; Work Paved Way for Laser"

by

Douglas Martin

July 27th, 2013

The New York Times

Distinguished Columbia University physicists, some of them Nobel Prize winners, called it a “harebrained scheme.” But James P. Gordon, principal builder of a refrigerator-size device that would help revolutionize modern life, believed in it enough to bet a bottle of bourbon that it would work.

He was a 25-year-old graduate student in December 1953 when he burst into the seminar room where Charles H. Townes, his mentor and the inventor of the device, was teaching. The device, he announced, had succeeded in emitting a narrow beam of intense microwave energy.

Dr. Townes’s team named it the maser, for microwave amplification by stimulated emission of radiation, and it would lead to the building of the first laser, which amplified light waves instead of microwaves and became essential to the birth of a new technological age. Lasers have found a wide range of practical applications, from long-distance telephone calls to eye surgery, from missile guidance systems to the checkout counter at the supermarket.

In 1964, Dr. Townes and two Russians, Nikolai G. Basov and Aleksandr M. Prokhorov, shared the Nobel Prize for Physics for the development of masers and lasers, the Russians having worked separately from Dr. Townes. Some thought Dr. Gordon, who died on June 21 at 85, deserved a share as well.

At the time of the maser’s invention, Dr. Townes credited it “to the triumph and glory” of Dr. Gordon.

“I worked on it with him,” he said years later, “but it was really Jim who made it work.”

Dr. Gordon handled much of the maser’s design work and was the lead author of the one-and-a-half-page paper announcing the achievement in the journal Physical Review in July 1954. He also gave the first talk about it to the American Physical Society.

At Dr. Gordon’s 80th birthday party, in 2008, Dr. Townes, who was 93 then, said one reason the Nobel committee did not recognize his younger colleague was its rule that no more than three people can be awarded any one prize.

He also suggested that Dr. Gordon might have been denied the prize because he was a student, although Dr. Basov was, too. Referring to the prize, Dr. Gordon once said, “It would have been too much too soon.”

Dr. Townes gave some of his prize money to Dr. Gordon, who used it to buy a Buick station wagon. Dr. Gordon won the bottle of bourbon from a young physicist in the department who, he later learned, had lost a similar bet with Dr. Townes, that one involving a bottle of Scotch.

Dr. Gordon, who lived in Rumson, N.J., died of complications of cancer in a Manhattan hospital, said his wife of 53 years, Susanna.

The maser is credited with inaugurating the field of quantum electronics, which uses a laser’s ability to shove around molecules and atoms in the development of electronic devices. Dr. Gordon went on to lead quantum electronics efforts at Bell Laboratories, for many years the world’s most innovative scientific organization. The projects ranged from tracing the universe’s origins to developing “optical tweezers” to manipulate atoms.

The research yielded many benefits. When Dr. Gordon was made an honorary member of the Optical Society in 2010, the organization’s president, James C. Wyant, said, “His work has led to countless application areas, especially optical communications — the backbone of high-speed Internet today.”

Linn Mollenauer, who worked with Dr. Gordon at Bell on laser advances that helped clarify long-distance telephone conversation, said Dr. Gordon tended to wait for his colleagues to bring him seemingly insoluble problems that had emerged from experiments.

“He would listen very patiently and carefully,” Dr. Mollenauer said in an interview. “And then he would go away, and a few days later he would come by and present us with a few pages of a beautifully written theoretical model.”

Steven Chu, who before becoming secretary of energy in the Obama administration won a Nobel in 1997 for his work on cooling and trapping atoms as a Bell scientist, mentioned Dr. Gordon in his Nobel lecture. He said that when he once asked a colleague a sophisticated physics question, the colleague replied, “Only Jim Gordon really understands the dipole force.”

But Dr. Gordon was the first to admit that nothing important in science is done in isolation, and that success always comes in the wake of others’ discoveries. Just as thinking by Einstein, Bohr and Planck laid theoretical groundwork for quantum electronics — and World War II prompted the development of technology to make it work — Dr. Gordon helped lay the foundation for a field honored by eight subsequent Nobel Prizes to 18 individuals.

James Power Gordon, the son of a corporate lawyer, was born in Brooklyn on March 20, 1928, and grew up in Forest Hills, Queens, and Scarsdale, N.Y. He graduated from Phillips Exeter Academy and the Massachusetts Institute of Technology. After M.I.T. turned down his application for graduate school — “fortunately, as it turned out,” he wrote — he went to Columbia.

In mid-1951, Dr. Townes asked Dr. Gordon to join his microwave project. Though he worried that it would hurt his doctoral dissertation if the maser did not work, Dr. Gordon said yes.

The students on the team built the maser with their own hands and encountered endless practical problems. One was that the vacuum chamber enclosing the maser sprang leaks.

“We even had names for the leaks,” Dr. Gordon wrote in Optics & Photonics News in 2010. “There was the necktie leak, where your tie gets sucked into the box.”

Even after it was proved to work and named, Dr. Gordon wrote, some of his Columbia colleagues questioned its usefulness. One said the maser actually stood for “money acquisition scheme for expensive research.”

But by 1955, newspapers were reporting that the maser could be used to keep time so well that in 300 years it would be off by less than a second. A headline in The New York Times said, “New Clock Joins World’s Wonders.” In 1958, The Boston Globe reported that the maser was being adapted for military use by 20 universities and electronics firms. One use was to sense missile launchings as far as 4,000 miles away.

But its largest significance was in paving the way for the laser. In 1958, Dr. Townes joined with Arthur L. Schawlow to write a paper, “Infrared and Optical Masers,” that described a device to produce laser light. They received a patent for it. A Columbia graduate student, R. Gordon Gould, came up with insights on how to build it and named it. Theodore H. Maiman, a physicist with Hughes Aircraft in California, built the first operational laser in 1960.

Besides his wife, the former Susanna Poythress Bland Waldner, Dr. Gordon is survived by a son, James Jr.; two daughters, Sara Bolling Gordon and Susanna Gordon; and four grandchildren. A brother, Dr. Robert S. Gordon Jr., who coordinated AIDS research at the National Institutes of Health, died in 1985.

Dr. Gordon, who won tournaments in platform tennis, liked to wear his raccoon coat, sit on his front porch and smoke a Meerschaum pipe. One time, his wife recalled, his hair was mussed, and she asked him, “Who do you think you are, Einstein?”

“I’m closer than most people,” he answered.


"Optics and Photonics Pioneer James Gordon Dies"

June 26th, 2013

photonics.com

James P. Gordon, co-inventor of the maser and a seminal contributor to optics and quantum electronics, died June 21. He was 85.

Gordon was born in New York City in 1928. He attended Exeter Academy and received a bachelor's degree from MIT in 1949. He received his master's and PhD degrees in physics from Columbia University in 1951 and 1955, respectively.

In 1954, as a student of Charles Hard Townes at Columbia, Gordon analyzed, designed, built and successfully demonstrated the maser (microwave amplification by stimulated emission of radiation) with Townes and Herbert Zeiger. Their ammonia maser, based on Einstein's principle of stimulated emission, laid the groundwork for the creation of the laser.

In 1955, Gordon joined AT&T Bell Laboratories, where he served as head of the Quantum Electronics Research Department from 1958 to 1980. He spent his entire career at AT&T Bell Labs, retiring in 1996. His colleagues often sought out Gordon for help.

Linn Mollenauer, who worked with Gordon at Bell Labs and who co-authored the book “Solitons in Optical Fibers: Fundamentals and Applications” with him in 2006, told the Asbury Park Press: “Various experimentalists would come to him with problems that they couldn’t understand. Jim would ever-so-politely listen, and then a few days later, would come around with a beautiful theory written out for whatever their problem was. He was just definitely one of the greatest.”

Gordon's other contributions laid the foundation for what would become the fields of lasers and optical communications. He conceived and provided the theory (with Gary Boyd) of confocal resonators, fundamental for the modern analysis of Gaussian laser beams and optical cavities that are critical to the design and operation of lasers. He also made several contributions to optical communications, including pioneering the quantum theory of the information capacity of an optical communications channel, observing soliton propagation in optical fibers for the first time and investigating the fundamental limits of coherent optical transmission systems, among many other contributions. His broad interests also included providing the theoretical basis for optical tweezers.

As the optical communications field evolved, Gordon continued to do research that provided key knowledge and insight critical both to fellow researchers and to ultimately deployed systems. His seminal work on what is now called the “Gordon-Haus” effect identifies and explains the most important bit-rate-limiting effect in soliton transmission, caused by the random walk of coherently amplified solitons. He provided other insights, including the explanation of the soliton self-frequency shift.

"Jim's contributions to optics and photonics, beginning in the 1950s with his co-invention of the maser, were crucial in shaping several areas of the field as we know them today — including quantum electronics, laser science and optical communications," said OSA CEO Elizabeth Rogan. "When Jim joined us in 2010 for the LaserFest gala celebrating the 50th anniversary of the laser, it gave us an opportunity to celebrate his legacy as one of the pioneers in modern optics and photonics. We were thrilled to have him there. He will be missed by all who knew him, and we send our deepest condolences to his family and loved ones."

Gordon's many honors include four OSA awards: the Charles Hard Townes Award (1981), the Max Born Award (1991), the Willis E. Lamb Award (2001) and the Frederic Ives Medal (2002). He was a member of both the US National Academy of Engineering and the US National Academy of Sciences, as well as a senior member of IEEE and a Fellow of OSA and the American Physical Society. He was named an OSA honorary member, OSA's highest honor, in 2010.

Gordon is survived by his wife, Susie, a former Bell Labs computer programmer, and their three children: James P. Gordon, Susanna Gordon and Sara Gordon.

“He did most of his work with a pad and a pencil and sat there until about 11 at night with complex mathematics and symbols, and if you’d look over his shoulder, you’d wonder what language it was in,” Susanna Gordon told the Asbury Park Press. “Sometimes he’d walk out and say, ‘I did it,’ and I’d go, ‘It sounds very important,’ and it was.”


James P. Gordon [Wikipedia]

Maser [Wikipedia]

Friday, July 26, 2013

Tragic and spectacular video and predictable physics


"The Physics of High-Speed Trains"

by

Patrick Di Justo

July 25th, 2013

The New Yorker

On Wednesday evening, a train travelling from Madrid to Ferrol, in northwestern Spain, derailed just as it was about to enter the Santiago de Compostela station. At least seventy-eight people were killed, and dozens were injured. Video of the accident shows the train entering the curve at what seems to be a high speed; the passenger cars detach from the engine and derail, while the engine stays on the tracks for a few more seconds before it, too, leaves the rails and hits a wall. Unofficial reports claim that the train was going as fast as a hundred and twenty miles per hour on track rated for only fifty m.p.h.

Unlike Japan’s Shinkansen or France’s T.G.V., which run on dedicated tracks, the Madrid-Ferrol route is a hybrid line, much like Amtrak’s Acela Express. Only part of the track is configured for high-speed travel; the rest is shared with slower trains, and can handle only their more restricted speeds.

High-speed rail is a catchall term with several definitions. The Federal Railroad Administration says it starts at a hundred and ten m.p.h., while the International Union of Railways says a hundred and fifty-five. But whichever definition one favors, the rails themselves must be carefully designed to handle the physical forces imposed upon them by multi-ton trains moving at high velocity.

One of those forces is centrifugal (“to flee from the center”) force, the inertia that makes a body on a curved path want to continue outward in a straight line. It’s what keeps passengers in their seats on a looping roller coaster and throws unsecured kids off carousels. Centrifugal force is a function of the square of the train’s velocity divided by the radius of the curve; the smaller and tighter the curve, or the faster the train, the greater the centrifugal force. As it increases, more and more of the weight of the train is transferred to the wheels on the outermost edge of the track, something even the best-built trains have trouble coping with. That’s where the concepts of minimum curve radius and super-elevation, or banking, come in.

Banked curves, in which the outer edge of the track is higher than the inner edge, balance the load on the train’s suspension. Since gravity pulls a train downward and centrifugal force pulls it outward, a track banked at just the right angle can spread the forces more evenly between a train’s inner and outer wheels, and help to keep it on the track.

But banking the tracks isn’t a cure-all—a passenger train can tilt only so far before people fall out of their seats. So the minimum curve radius comes into play. Imagine that a curved portion of track is actually running along the outer edge of a large circle. How big must that circle be to insure that a train’s centrifugal force can be managed with only a reasonable amount of banking?

It’s relatively easy to calculate these forces and the ways to counteract them, so it’s relatively easy to set a safe maximum speed for a certain kind of track. Yes, badly maintained tracks, trains, or signals can sometimes contribute to a derailment. Historically, however, many of the world’s worst train accidents on sharp curves—the 1918 Malbone Street wreck in the New York City subway system, which killed at least ninety-three people (figures vary), or the Metro derailment in Valencia, Spain, in 2006, which killed forty-three—were simply caused by the trains going too fast.

That seems to be the case in the Santiago de Compostela accident: tracks rated for fifty miles per hour need almost no banking and can have a curve radius of fifteen hundred feet, while a train traveling at a hundred and twenty miles per hour needs a track with significant banking, and a minimum curve radius of more than a mile and a half. The laws of physics all but insured that in this particular battle between gravity and centrifugal force, the latter would win.
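
The figures in that last paragraph follow directly from the physics: lateral acceleration on a flat curve is v^2 / r, so the radius needed to keep the forces constant grows with the square of the speed. Below is a minimal sketch in Python that checks the article's numbers; the formulas are the standard flat-curve and balancing-bank relations, and the unit conversions and printed checks are mine rather than anything from the piece.

    import math

    G = 9.81              # gravitational acceleration, m/s^2
    MPH_TO_MS = 0.44704   # miles per hour -> metres per second
    FT_TO_M = 0.3048      # feet -> metres

    def lateral_accel(speed_ms, radius_m):
        # Centripetal acceleration on an unbanked curve: v^2 / r
        return speed_ms ** 2 / radius_m

    def balancing_bank_deg(speed_ms, radius_m):
        # Bank angle at which gravity exactly balances the curve forces:
        # tan(theta) = v^2 / (g * r)
        return math.degrees(math.atan(speed_ms ** 2 / (G * radius_m)))

    # The article's 50-m.p.h. track with a 1,500-foot curve radius:
    v50 = 50 * MPH_TO_MS
    r50 = 1500 * FT_TO_M
    print(lateral_accel(v50, r50) / G)   # ~0.11 g, so "almost no banking" is needed
    print(balancing_bank_deg(v50, r50))  # ~6.4 degrees

    # Lateral acceleration scales with v^2, so matching those forces at
    # 120 m.p.h. requires (120/50)^2 = 5.76 times the radius:
    r120 = r50 * (120 / 50) ** 2
    print(r120 / 1609.344)               # ~1.6 miles: "more than a mile and a half"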

Syncom communications satellite...50 years ago


"Hughes engineer Harold Rosen's team overcame technical and political hurdles to send the Syncom communications satellite into orbit 50 years ago."

by

Ralph Vartabedian

July 26, 2013

The Los Angeles Times

In the fall of 1957, the Soviet Union's newly launched Sputnik satellite would regularly streak across the Los Angeles sky, a bright dot in the black night.

All it could do was broadcast beeps back to Earth, but the technical achievement by the communists had stunned America. Perhaps nobody was more taken aback than a group of engineers and scientists at the defense electronics laboratories of Hughes Aircraft in Culver City.

They would trudge up a fire escape to the roof and watch the satellite with a mix of astonishment, excitement, envy and fear. Among them was Harold Rosen, a young Caltech-trained engineer with a doctorate who, as he watched Sputnik, was hatching an audacious plan to eclipse the Russians.

What he imagined by 1959 was a revolution in communications: an extremely lightweight, solar-powered telephone switching station in orbit 22,000 miles above Earth. In those days, an international telephone connection required making a reservation because the existing system — copper cables and radio signals — carried few calls. Many countries could not be called at all. A satellite could change all of that.

Rosen recruited two other engineers, Thomas Hudspeth and Don Williams, and began designing the electronics and the propulsion and power system needed for a communications satellite. Not only was the task technically tough, but they also were fighting many of the nation's top experts who did not believe their idea would work. Even their bosses — at a company founded by the eccentric billionaire Howard Hughes — were not sure their project was worth a modest investment.

"I considered it me against the world," Rosen said about the initial lack of government and industry support.

Inside their labs on Centinela Avenue, the men pushed the technology ahead at blinding speed and found key allies in government who were willing to bet on a trio of unknown engineers.

On July 26, 1963 — exactly 50 years ago — they launched a 78-pound satellite called Syncom that could receive signals from Earth and then transmit them back across the globe.

Of all the technological breakthroughs made in Los Angeles during the Cold War — the laser, the first supersonic jet fighter, the Apollo moon ship, stealth aircraft, the space shuttle, the intercontinental ballistic missile system and much else — the creation of a communications satellite has had the largest and most enduring cultural, social and economic impact.

The little Syncom has morphed into communications satellites the size of school buses, weighing more than 13,000 pounds, operating with solar wings the length of a basketball court and running electronics with more power than a typical house wired to the electrical grid.

Electronic credit card authorizations, international television signals, email and social media — all the things that define our modern connected culture — were not even imagined by the public in the 1950s and would not exist today in many areas of the world without communications satellites.

About 500 such satellites are orbiting Earth, allowing cruise ships to communicate with ports, music to be beamed down to radios and television shows to arrive in living rooms, all because of a technology nearly as unknown by the public as Rosen himself.

[Graphic: the locations of many modern-day commercial communications satellites in geosynchronous orbit; co-located satellites are offset in the graphic so they do not overlap.]

Rosen is an athletic 87-year-old with a full head of sandy-colored hair. He's lived in the same Pacific Palisades home — with ceiling-to-floor windows that overlook a lush garden — for 60 years.

His parents emigrated from Montreal to New Orleans, where he was born. After studying engineering at Tulane University, he dithered over whether to continue his education at Harvard or Caltech. The decision was made when he saw a Life magazine story about beach parties in Southern California. He bought a train ticket.

"I came out on the Sunset Limited and never looked back," Rosen recalled. "I still love the beach."

By the time Rosen arrived at Hughes in 1956, the company was gaining prominence in the scientific world. He began designing airborne radar that could spot Russian bombers, but the Air Force canceled the program and his bosses challenged him to find something new.

When Rosen's team proposed Syncom, many of the nation's top experts, notably at Bell Labs, thought they were on the wrong track. Instead, Bell Labs and others were working on a large network of satellites in low Earth orbit that would require a complex system of ground tracking stations.

Rosen was confident that he could build a satellite to operate at 22,000 miles directly above the equator, which would allow it to remain stationary and provide continuous coverage over a third of the world. The problem was that American rockets of the 1960s lacked the power to launch heavy payloads to high orbits. Rosen would have to keep Syncom as light as possible, which became the key to its success.
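
The 22,000-mile figure is not arbitrary: a satellite hovers over one spot only if its orbital period matches Earth's rotation, and Kepler's third law then fixes the orbital radius. Here is a minimal sketch of that calculation; the constants are standard textbook values, not numbers taken from the article.

    import math

    MU = 3.986004418e14    # Earth's gravitational parameter GM, m^3/s^2
    T_SIDEREAL = 86164.1   # Earth's rotation period (one sidereal day), s
    R_EARTH = 6378.137e3   # Earth's equatorial radius, m

    # Circular-orbit period T = 2*pi*sqrt(r^3 / GM), solved for r:
    r = (MU * (T_SIDEREAL / (2 * math.pi)) ** 2) ** (1 / 3)
    altitude_miles = (r - R_EARTH) / 1609.344
    print(altitude_miles)   # ~22,236 miles -- the article's "22,000 miles"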

Top Hughes executives were reluctant to invest in a prototype, even after Rosen, Williams and Hudspeth each offered to invest $10,000. Rosen went to government offices, universities and competing electronics companies to find encouragement and a financial partner.

After Raytheon Corp. offered Rosen and his team jobs and the chance to develop Syncom there, Hughes executives changed their minds and committed to a $2-million investment.

"It was a vindication for everything we had gone through," Rosen said.

Rosen pioneered the overall concept and design: The satellite would remain stable by spinning like a gyroscope, and a propulsion system would maintain its orbital position. The barrel-shaped spacecraft was covered by solar cells that supplied electrical power.

Williams came up with a key innovation: using a single lightweight rocket engine to control the spinning satellite's position with short bursts of thrust. The resulting Williams patent, by itself, yielded Hughes millions in royalties.

By 1961, they had built a working 55-pound prototype, which they took to the Paris Air Show and used to transmit photos across the room.

Don Williams, left, Thomas Hudspeth, center, and Harold Rosen designed the electronics, propulsion and power system for the Syncom communications satellite.

The trio still needed federal government support to build and launch an operational version, though. Help came from a former Hughes executive, John Rubel, who was deputy research director at the Defense Department.

Rubel was overseeing a troubled attempt by the Pentagon to build its own communications satellite. Virtually no hardware had been created and the projected weight was in the thousands of pounds, recalled Rubel, now 93.

A friend told him about Rosen. Rubel remembered, "He had this thing that weighed 55 pounds and it was immediately obvious to me that this was it, the solution to all of our problems."

He arranged a deal to allow NASA to fund the launch. The first attempt in early 1963 failed because of a rocket malfunction. But the second launch was successful.

Test signals to a Navy ship docked in Lagos, Nigeria, confirmed the satellite was working. In a later check of the system, Rosen handed the telephone to his wife, Rosetta, and a sergeant on the other end said hello. She dropped the phone and said, "My God, Harold, it works."

Rosen said he never doubted it would work. "We had overcome all these hurdles — all these political hurdles more than technical hurdles — and the way was clear," he said.

With Syncom, Hughes not only had beaten out every other corporation in a landmark achievement, but it also had started a technological revolution.

"We very quickly could feel that we had the world by the tail," said Robert Roney, 90, the Hughes research director who had hired Rosen. "We were way ahead of the curve. All of us felt like we were the luckiest people alive."

Albert Wheelon, who would later become chief of the Hughes satellite business, was at the time deputy director of the CIA. He remembered reading about the Syncom launch in a newspaper.

"I said this is really important for what we are doing at the agency,"
he said. "Instead of putting these listening posts around the Soviet Union, we could put one of these things up in the sky and listen to everything."

One day a few years after the Syncom success, Williams visited Rosen with something on his mind. He apologized for not including Rosen's name on the patent for the rocket control system. Rosen insisted no apology was necessary. (Rosen would eventually have his name on more than 50 patents, including the basic patent for Syncom.)

Later that day, Williams went home and killed himself. He was 34.

The third engineer on the team, Hudspeth, died in 2008 at the age of 89.

Rosen still works a couple of days a week on satellite systems at a Boeing office in El Segundo and sometimes gives lectures to young engineers at Caltech. On occasion, he exercises on Santa Monica beach with his wife, Deborah Castleman, a former satellite engineer and deputy assistant secretary of Defense during the Clinton administration. Rosen's first wife, Rosetta, died in 1969.

Rosen has a shelf full of medals, including the Charles Stark Draper Prize, considered the Nobel Prize of engineering, which he shared with his rival John Pierce, a Bell Labs expert who in the 1950s had advocated low-Earth-orbit satellites.

The satellite technology that Rosen, Williams and Hudspeth created is now a $190-billion-a-year industry. Boeing acquired Hughes' satellite business 13 years ago and still operates a sprawling manufacturing plant with 5,200 employees in El Segundo. It has an order backlog of 32 satellites, 17 of them commercial and the balance for defense, intelligence or space agencies.

Over the years, seven Hughes employees who worked on satellites or data transmission were admitted into the National Academy of Engineering. They included Rosen, Wheelon, Roney and Eddy Hartenstein, now publisher of the Los Angeles Times, who pioneered the technology for delivering satellite television directly into homes and then created DirecTV.

Several weeks after the Syncom launch, President Kennedy inaugurated international satellite telephone service to Nigeria, where the Navy had stationed its receivers. The symbolic phone call to Nigerian Prime Minister Abubakar Tafawa Balewa lasted two minutes.

Kennedy and Balewa traded pleasantries, briefly mentioned the nuclear weapons test ban treaty signed that year, and talked about a boxing match in which Nigerian middleweight boxer Dick Tiger had retained his title against an American.

The next year, 1964, the third Syncom satellite transmitted live coverage of the Summer Olympics from Japan, and Hughes Aircraft was on the way to dominating the commercial satellite industry.

 
Syncom [Wikipedia]

Wednesday, July 24, 2013

Alan Turing pardoned


"Father Of Artificial Intelligence To Be Pardoned For Being Gay"

by

George Dvorsky

July 23rd, 2013

io9

Back in 1952, mathematician and computer scientist Alan Turing was convicted of gross indecency — the standard criminal charge for homosexuality. After his chemical castration, he killed himself by eating an apple laced with cyanide. Now, over 60 years later, he's set to be pardoned.

It's hard to assess the impact of Alan Turing. Not only did he contribute to the Church-Turing thesis (the suggestion that any real-world computation — including cognition — can be translated into an equivalent computation involving a Turing machine) and the Turing Test, he also played an incalculable role in World War II by cracking the Nazis' Enigma encryption machine.

But because he was gay, the UK chose not to celebrate him — but rather, to terrorize him.

In 2009, the British government officially apologized for its homophobia, but his conviction remained on the books. Now, the government has said it would not stand in the way of pending legislation that would offer a full parliamentary pardon for Turing.

The Telegraph reports:

    Last December Prof Stephen Hawking and other leading scientists wrote to The Daily Telegraph urging a pardon for Turing, whose work at Bletchley has been credited for hastening the end of the Second World War.

    Speaking in the House of Lords on Friday, Lord Ahmad of Wimbledon, a whip, said the Government would not stand in the way of a Bill brought by Liberal Democrat peer Lord Sharkey, which offers Turing a full posthumous parliamentary pardon.

    Speaking in the House of Lords shortly before the Alan Turing (Statutory Pardon) Bill received an unopposed second reading, Lord Ahmad said: “Alan Turing himself believed that homosexual activity would be made legal by a Royal Commission.

    “In fact, appropriately, it was Parliament which decriminalised the activity for which he was convicted.

    “The Government therefore is very aware of the cause to pardon Turing given his outstanding achievement and therefore has great sympathy with the objective of the Bill.

    “That is why the Government believes it is right that Parliament should be free to respond to this Bill in whatever way its conscience dictates, in whatever way Parliament so wills."


Of course, this is of little consolation to Turing, or to the other 49,000 gay men who were convicted under the 1885 Criminal Law Amendment Act — a list of men that includes Oscar Wilde. Perhaps the bolder act would be to recognize those individuals as well.



Forgiveness for Alan Turing by Stephen Hawking and others

Lick 'em and stick 'em...some postal history


"The Story of the First Postage Stamp"

July 19th, 2013

Smithsonian.com

“Philately” (get your mind out of the gutter) is the proper term for the study of stamps and stamp collecting. It was coined in 1865 by Georges Herpin, who very well may have been the first stamp collector, from the Ancient Greek φίλος (philos), meaning “love of,” and ἀτέλεια (ateleia), meaning “without tax.” Of course, because the ancient Greeks didn’t have postage stamps, there was no proper Greek word for the idea. But, as we shall see, the term is actually a reference to the earliest days of paid postage.

Postage can reveal more than the history of a letter; it can reveal the history of a nation. As noted by the National Postal Museum, which celebrates its 20th anniversary this month, “every stamp tells a story”—and, I might add, it sometimes tells how the story should be told (fat Elvis or skinny Elvis?).

The forthcoming book A History of Britain in Thirty-Six Postage Stamps by Chris West tells the story of the stamp. And of Britain. West is himself a philatelist (seriously, stop snickering) who inherited a collection from his uncle that included a “Penny Black”—the first postage stamp issued in Britain and, more importantly, the first postage stamp issued anywhere.

The Penny Black bears the image of Queen Victoria, but the first British postal service did not originate in Victorian England. In 1680 an entrepreneur by the name of William Dockwra started a public service that guaranteed the quick delivery of a letter anywhere in London. His system was quickly nationalized with Dockwra in charge. It was far from a perfect system, burdened with seemingly arbitrary charges and tariffs that made it unreasonably expensive to send a letter. Worse still, recipients were expected to pay. As you might imagine, this presented some problems—either people weren’t home or they flat-out refused to pay. Not to mention the blatant corruption. The system just didn’t work, but it remained in place for far too long.

About 150 years later, an ambitious polymath named Rowland Hill thought he could do better. Hill ran a progressive school, for which he also designed a central heating system, a swimming pool and an observatory. Hill’s skills weren’t just architectural and pedagogical; he was also an accomplished painter, inventor and essayist. In one of his most famous pamphlets, Post Office Reform, its Importance and Practicability, Hill argued for abolishing the postal tariffs and replacing them with a single national rate of one penny, which would be paid by the sender.

When the post office ignored Hill’s ideas, he self-published his essay and it quickly gained ground among the public. Hill was then summoned by Postmaster General Lord Lichfield to discuss postal reform and, during their subsequent meeting, the two men conceived of an adhesive label that could be applied to envelopes to indicate payment. Though it had gained momentum with the public, who longed for an affordable way to connect with distant friends and family, officials still weren’t convinced, calling it “extraordinary” (in a bad way) and “preposterous,” and probably saying things like “crikey!” and “I say!” and “what hufflepuffery!” and other such exclamations popular among the blustery Victorian bureaucrat set. Thankfully, Hill was far from alone in his passion for reform. He eventually earned enough support from other like-minded individuals, like Henry Cole, founding director of the Victoria and Albert Museum, as well as larger, powerful organizations, to convince Parliament to implement his system.

In 1839, Hill held a competition to design all the necessary postal paraphernalia. The winning stamp entry depicting the young Queen’s profile came from one William Wyon, who based the design on a medal he created to celebrate her first visit to London earlier that year. Hill worked with artist Henry Corbould to refine the portrait and develop the stamp’s intricate background pattern. After deciding to produce the stamps through line engraving, engravers George Rushall and Charles and Frederick Heath prepared the design for printing.

The “penny black” stamp went on sale May 1, 1840. It was an immediate hit. Suddenly, the country seemed a lot smaller. Over the next year, 70 million letters were sent. Two years later, the number had more than tripled. Other countries soon followed suit. The Penny Black’s design was so well received, it remained in use for forty years, though, as the National Postal Museum notes, “it underwent color changes (1841), adopted perforations (1848), and acquired check letters in all four corners (1858)…and most of those designs were retained for Victoria’s successor, Edward VII, (1901) with his profile being substituted.”

The National Postal Museum also shares some insight into why we put stamps on the upper right corner of envelopes. The answer is refreshingly utilitarian: the location of the stamp was decided because over 80 percent of London’s male population was right-handed and it was believed this would help expedite the postmarking/cancellation process.

“Stamps can be a good way of establishing a ‘national brand,’” says West. Indeed, a nation’s stamps express the identity and the ambitions of a country. Few countries understood this better than Czechoslovakia, whose government hired noted artist and graphic designer Alphonse Mucha to design its stamps—as well as its money, and almost every other official piece of paper—when the country gained its independence after World War I. West cites other examples, noting how Germany, after World War II, focused on the country’s positive contribution to European culture, while modern America illustrates its history, diversity and individual achievement with its numerous stamps celebrating famous artists and innovators.

A History of Britain in Thirty-Six Postage Stamps lives up to its title. Though stamps may be the subject of the book, its content is full of insight into the full history of the British Empire, from Queen Victoria to Kate Middleton. Through West’s book, we get fascinating stories and anecdotes about wars, celebrations, the mercurial fortunes of Britain’s royalty, the rise and fall of its empire and, of course, design. All told a penny at a time.


Penny Black [Wikipedia]




A History of Britain in Thirty-six Postage Stamps

by

Chris West

ISBN-10: 1250035503
ISBN-13: 978-1250035509

Monday, July 22, 2013

Printed books aren't dead


"Authors stand up for traditional books over e-books"

Just a few years after the launch of the Kindle, old-fashioned books are making a comeback as authors promote the joy of bookshelves and well-thumbed pages over the e-book.

by

Louise Gray

July 21st, 2013

The Telegraph

A year ago Amazon reported its Kindle e-books were outstripping its sales of printed books.

However, summer reading lists this year show that most authors prefer a proper old-fashioned book to touch screens.

Writers prefer a well-stuffed bookshelf to one slim tablet and admire a well-illustrated book over a touch screen. “Ugly adverts” appearing on screens are also a problem for authors.

Alain de Botton, the philosopher, said he soon dumped e-books when he realised the information didn’t really sink in without physical contact with a real book.

"I'm a recent apostate from e-books. I found that whatever I read on my Kindle I couldn't really remember in the long term. It was as if I had never read it," he told the Sunday Times.

Jilly Cooper missed the ability to make notes on e-book readers in the same way as with traditional books.

"I like to scribble all over [books] and write things and say 'Well done' and 'God how awful' and 'Let's remember that bit'. I always underline good bits and turn over the pages of bits that absolutely knock me out," she said.

Amazon launched the UK store for its Kindle e-book reader three years ago and a frenzy of e-book-buying followed. In 2011 sales grew by 366 per cent, and they doubled again last year, when 65m e-books were sold in the UK, making up 17 per cent of the total book sales market.

The Fifty Shades erotic trilogy by EL James has contributed much to the phenomenon, filling the top three slots in last year's e-book sales charts.

Philip Stone of The Bookseller said growth is slowing.

"After the initial dramatic sales, we've seen enthusiasm wane more recently," he told the Sunday Times. "The format is certainly here to stay but we are expecting sales will only increase by around 20 per cet in total this year."

Mr Stone said the bestselling genres on e-book were those such as thrillers, which are commonly read in a hurry.

Friday, July 19, 2013

First female rocket scientist...Mary Sherman Morgan


"Remembering the US's first female rocket scientist"

July 17th, 2013

BBC World News

In 1957, the space race was heating up.

After the Soviet Union launched Sputnik into orbit, the United States was desperate to catch up. Dr Wernher von Braun and other top engineers struggled to find a solution to the US rocket programme's failures, so they turned to Mary Sherman Morgan.

In the early 1950s, Morgan was the only female analyst among 900 rocket scientists at North American Aviation. She was also one of the few without a college degree.

While von Braun became a familiar figure in newspapers and on television screens across the country, Mary Sherman Morgan's story is more obscure.

In his book, Rocket Girl, George D Morgan tells the story of his mother's journey from North Dakota farm girl to brilliant scientist whose obscure, yet crucial, contributions to the development of a new rocket fuel powered the country's first satellite, Explorer 1. 



Mary Sherman Morgan [Wikipedia]


Rocket Girl: The Story of Mary Sherman Morgan, America's First Female Rocket Scientist

by

George D. Morgan

ASIN: B00BH0VPS8

Pitch Drop Experiment...in the limelight


"Longest Running Laboratory Experiment Keeps on Going, Ever So Slowly"

June 18th, 2013

Slate

To view the experiment that the University of Queensland's School of Maths and Physics boasts is "more exciting than watching grass grow," you'll need to go to the display cabinet in the school's foyer. There, beneath a glass dome, you will see a funnel filled with asphalt. It doesn't seem to be doing anything other than sitting there, but do not be deceived: You are looking at the world's longest continuously running lab experiment in action.

The Pitch Drop Experiment began in 1927, the brainchild of UQ physics professor Thomas Parnell. His aim: to demonstrate that pitch—a term that includes substances such as asphalt—is not solid, but a highly viscous liquid.

To prove this, Parnell poured a heated sample of pitch into a closed funnel and let it settle, a process that took three years. In 1930, he cut off the end of the funnel, allowing the pitch to flow freely. Which it did. Very, very slowly. The first drop fell into the beaker in 1938, with the second and third drops following in 1947 and 1954. With the installation of air-conditioning in the building came a reduction in the flow rate—the eighth drop, which hit the beaker in November 2000, took 12 years and four months to fall.
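
Those dates also show the experiment's pace. A back-of-the-envelope check in Python, using only the years given above (months are ignored, which is my simplification), lands at roughly one drop per decade:

    # Drop years from the article: first drop 1938, then 1947 and 1954;
    # the eighth drop fell in late 2000.
    early_gaps = [1947 - 1938, 1954 - 1947]   # 9 and 7 years
    avg_gap = (2000 - 1938) / 7               # seven intervals between drops 1 and 8
    print(early_gaps, round(avg_gap, 1))      # [9, 7] 8.9 -- about one drop per decade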

Parnell died in 1948, but his experiment has since been passed on to other custodians from the university. He never saw a drop fall, but neither has anyone else—the asphalt has an exasperating tendency to drip when no one is looking. A live-streaming webcam trained on the experiment during the 2000 drop malfunctioned. The next pitch drop is expected to fall sometime during 2013.


"The Single Drip Scientists Have Been Waiting the Better Part of a Century for Just Dropped"

by

Josh Voorhees

July 18th, 2013

Slate

Science lovers are aflutter today over the above video, which may not look like much—OK, definitely doesn't look like much—until I tell you some people have been waiting for the better part of a century to lay eyes on the long-elusive "pitch drop" captured in the time-lapse footage you just watched with almost infinitely less of a time investment.

I won't pretend like I was even mildly aware of this until a more science-savvy colleague flagged it for me this afternoon, so I'll let Nature explain the back story behind the simple experiment that's developed its own cult following thanks to a series of absurdly near misses in the past:

    The Dublin pitch-drop experiment was set up in 1944 at Trinity College Dublin to demonstrate the high viscosity or low fluidity of pitch — also known as bitumen or asphalt — a material that appears to be solid at room temperature, but is in fact flowing, albeit extremely slowly.
 
    It is a younger and less well-known sibling of an experiment that has been running since 1927 at the University of Queensland in Brisbane, which Guinness World Records lists as the world’s longest-running laboratory experiment. Physicist Thomas Parnell set it up because he wanted to illustrate that everyday materials can exhibit surprising properties.

OK, still a little fourth-period-science-class boring, I admit. But where things get good is when you learn about all the close calls suffered by researchers during the years before technology allowed them to basically just DVR the whole thing. Megan Garber, who does a truly great job capturing the excitement around the drop heard 'round the science world, details some of the more painful misses over at the Atlantic. Those misses, in turn, were all the more painful because on average a typical professional pitch-drop experiment delivers the goods about once per decade, something that goes a long way toward explaining why no one had ever seen a drop before. So when you step out for 15 minutes to grab tea, as a Queensland professor did once, you're probably never going to be able to enjoy Earl Grey again after you come back to discover you missed something you'd spent the past 10 years holding your breath for.



Pitch Drop Experiment

Wednesday, July 17, 2013

Deceased--Chuck Foley



Chuck Foley
September 6th, 1930 to July 1st, 2013

"Chuck Foley, Co-Creator of the Game Twister, Dies at 82"

by

Margalit Fox

July 11th, 2013

The New York Times

Chuck Foley, a game designer who helped inaugurate a craze that rivaled the hula hoop, scandalized the puritanical and drove chiropractors wild with delight, died on July 1 in St. Louis Park, Minn. Mr. Foley, an inventor of Twister, was 82.

The cause was complications of Alzheimer’s disease, his son Mark said.

Mr. Foley and a colleague, Neil Rabens, developed the game, known on the drawing board as Pretzel, for a St. Paul design concern in the mid-1960s. Originally manufactured by Milton Bradley, Twister was introduced in 1966 and has gone on to sell tens of millions of copies.

In 1969, Mr. Foley and Mr. Rabens were awarded United States Patent 3,454,279 for their invention, “Apparatus for Playing a Game Wherein the Players Constitute the Game Pieces.”

Currently made by Hasbro, Twister is inextricably knotted into late-20th-century popular culture. In a memorable sendup of the chess-playing scene in Ingmar Bergman’s “Seventh Seal,” for instance, the 1991 film “Bill & Ted’s Bogus Journey” features its young heroes (Keanu Reeves and Alex Winter) playing Twister with Death. Death loses.

The game’s premise is simple, involving two or more players, sock feet, a spinner and a vinyl mat emblazoned with dots of red, blue, yellow and green. As the spinner is spun, players enact its instructions (“Right hand, yellow”; “Left foot, green”), forming a complex, suggestive architecture for which human limbs are inadequately evolved. To fall is to be eliminated.
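For the curious, the spinner's job is easy to make concrete in code. Here is a minimal Python sketch: the four limbs and four colours come from the description above, while the equal weighting of every combination is my own assumption, not a detail taken from the patent:

    import random

    # A minimal simulation of the Twister spinner described above. The
    # limbs and colours are taken from the article; the uniform random
    # choice over all sixteen combinations is an assumption.
    LIMBS = ["Right hand", "Left hand", "Right foot", "Left foot"]
    COLORS = ["red", "blue", "yellow", "green"]

    def spin():
        """Return one instruction, e.g. 'Right hand, yellow'."""
        return f"{random.choice(LIMBS)}, {random.choice(COLORS)}"

    for _ in range(5):
        print(spin())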

Visually, Twister marries Alexander Calder sculpture with the hokey pokey and the Kama Sutra — the last point brought unmistakably home on “The Tonight Show” in 1966, when Johnny Carson and a low-necklined Eva Gabor played the game on camera, sending sales soaring.

Charles Fredrick Foley II was born on Sept. 6, 1930, in Lafayette, Ind. Before he was out of grade school, he invented an automatic latching mechanism for the cattle pen on his grandfather’s farm. As a young man, he worked for a research and development concern in Detroit, where he helped perfect an automatic cocktail shaker.

In the 1960s, he joined Reynolds Guyer House of Design, the St. Paul company at which he and Mr. Rabens, an artist, developed Twister. Mr. Foley received no royalties for the game, his son said, eventually accepting about $27,000 in a negotiated buyout.

In retirement a resident of Charlotte, N.C., Mr. Foley lived most recently in Minneapolis. His other inventions include toy handcuffs and Un-Du, a liquid that removes adhesives.

Mr. Foley’s first wife, the former Kathleen Maley, died in 1975; his second wife, the former Melonie Reece, died in 2007. Besides his son Mark, his survivors include two brothers, Mike and Bernard; two sisters, Veronica Lewis and Carolyn Walker; five other sons, Chuck, Kevin, Brian, Pat and Mike; three daughters, Mary Kay Foley, Kerin Logstrom and Katie Foley; 16 grandchildren; and four great-grandchildren.

Mr. Rabens also survives.

When Twister made its debut, Milton Bradley’s competitors accused the company of selling “sex in a box.” But that, Mr. Foley said, was practically beside the point.

“Once you get men and women in play positions, unless you’re drinking, you forget the sex thing,” he told an interviewer in 1994. “The urge to win takes over.”

Victorian space observatory restored





[Photographs: Llewelyn's daughter Thereza Llewelyn; John Dillwyn Llewelyn]

"Victorian space observatory where earliest pictures of the moon were taken is saved for the nation after falling into disrepair"

Observatory on Penllergare estate was crumbling after years of neglect
Pioneer John Dillwyn Llewelyn took early pictures of moon from the site
It has now been saved by a £2.9m Lottery Heritage Fund restoration project

by

James Rush

July 9th, 2013

Daily Mail

A Victorian space observatory where some of the world’s earliest photographs of the moon were taken has been saved.

Photographic pioneer John Dillwyn Llewelyn took early pictures of the moonscape in 1857 from the observatory.

The 'nationally important' building on the Penllergare estate, near Swansea, South Wales, was crumbling after years of neglect.

But it has now been saved by a £2.9m Lottery Heritage Fund restoration project.

The astronomical observatory was built in 1846 by wealthy industrialist and renaissance man John Dillwyn Llewelyn - a founder member of the Royal Photographic Society.

After installing a telescope in the building, he and his daughter Thereza captured some of the earliest photographs of the moon.

Llewelyn was a noted botanist, scientist, abolitionist and a health campaigner as well as a world-renowned photographer.

Queen Victoria and Prince Albert bought many of his pioneering works to add to their growing photographic collection.

His invention of the Oxymel process in the 1850s, which used honey and vinegar to preserve images instantly, meant landscape photographers could do away with cumbersome portable laboratories and darkroom tents.

Llewelyn’s industrialist family owned the Cambrian pottery in Swansea, South Wales, and he inherited Penllergare house and estate in 1833.

He died in 1882, aged 72, and by the 1930s the house and estate began a period of decline.

For more than half a century the Penllergare estate was almost forgotten and the mansion was destroyed and replaced by council offices.

But now the historic observatory and surviving estate woodlands have been saved for the nation by a £2.9m Heritage Lottery Fund grant.

Penllergare Trust chairman Terry Jones said: 'The observatory is the scientific legacy of John Dillwyn Llewelyn.

'It has survived over half a century of neglect along with the picturesque designed landscape garden in the valley below.

'We’re looking forward to bringing this nationally important historic structure back into good condition and conserving it for public benefit.'


John Dillwyn Llewelyn [Wikipedia]

Penllergare [Wikipedia]

Puzzling snap


I am guessing. Is this Zooey Deschanel from The Hitchhiker's Guide to the Galaxy?

What a waste...chemist convicted of murder


"Chemist found guilty of murder"

by

Laura Howes

July 17th, 2013

Chemistry World

A chemist in the US has been found guilty of killing her husband with thallium obtained from the Bristol-Myers Squibb (BMS) lab where she worked. Tianle Li, 43, was also found guilty of hindering her prosecution after originally claiming that she did not have access to thallium at BMS.

During the trial, testimony revealed that Li had obtained four bottles of a thallium salt from BMS in November 2010, before returning three later that year. In the US the general public cannot purchase thallium, but it can be useful in the chemistry lab; its salts, for example, can help promote cross-coupling reactions. However, thallium is also known as a perfect poison: odourless, colourless and tasteless, and less than a gram can be deadly.

On 14 January 2011, Li’s husband, Xiaoye Wang, checked himself into University Medical Centre in Princeton with flu-like symptoms. It was not until several days later that thallium poisoning was suspected and a urine sample was sent for testing; the sample had to be transferred to a lab in another state.

The actual test for thallium is quite simple, says Barry Samson, director of the trace element laboratory at Charing Cross Hospital in London, UK, who uses inductively coupled plasma mass spectrometry on blood or urine samples. However, he adds, not all labs have the equipment, and although thallium poisoning was suspected in this case, heavy-metal screens will not always check for thallium if there is no suspicion that the patient has come into contact with the element.

Wang died in hospital on 26 January 2011, just one day after doctors received the results of his tests. Curtis Rykal, a lab technologist at the Mayo Clinic in Minnesota who tested Wang’s urine for thallium, told the court that the results ‘were higher than our highest reportable amount. It was off the charts, I have never seen results that high.’
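Purely as an illustration of what 'higher than our highest reportable amount' means in practice, here is a hypothetical Python sketch of such a screening check; both numeric thresholds are invented placeholders, not real clinical reference values:

    # Hypothetical screening logic for an ICP-MS urine thallium result.
    # Both thresholds below are invented placeholders, NOT clinical values.
    NORMAL_UPPER = 5.0          # assumed upper limit of normal, in ug/L
    HIGHEST_REPORTABLE = 200.0  # assumed top of the assay's calibrated range

    def interpret(result_ug_per_l):
        if result_ug_per_l > HIGHEST_REPORTABLE:
            return "above highest reportable amount -- off the charts"
        if result_ug_per_l > NORMAL_UPPER:
            return "elevated -- consistent with thallium exposure"
        return "within the assumed normal range"

    for value in (2.0, 50.0, 10_000.0):
        print(value, "->", interpret(value))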

Thallium is so poisonous because the body mistakes it for potassium, so it interferes with numerous processes and causes nerve damage. Thallium poisoning can be treated with Prussian blue, but in Wang’s case hospital staff were unable to obtain any before he died.

Tianle Li will be sentenced later this year and faces life in prison for the crime. Chemistry World approached BMS for comment, but the company declined, citing pending civil litigation related to the case.



Thallium [Wikipedia]