Friday, February 27, 2015

Sharing Your Personal DNA Data Can Help Others

Personal Genomics

A Human Genome: A museum visitor in New York City views a digital representation of the human genome in 2001. We have come a long way since June 2000, when the Human Genome Project successfully presented a working draft of the human genome. The cost of having your genome sequenced has dropped considerably since then. Personal genomics will lead to personalized medicine and made-to-measure drugs to treat a host of diseases, including cancer.
Photo Credit: Mario Tama, Getty Images

An article (“The Internet of DNA”), by Antonio Regalado, in MIT Technology Review looks at the issue of personal genomics—the sequencing of an individual’s DNA—and the benefits of sharing it across a large network like the Internet in the hopes of helping others with similar diseases. This, of course, raises issues of privacy and possible breaches of security, but it also raises issues of human cooperation and the multiplying power of sharing information and knowledge.

In many ways, it defines the modern battle between personal privacy and the public pursuit of knowledge. What has to be decided is whether society at large is ready to sacrifice individual privacy to increase knowledge and better humanity. Your medical treatment, notably if successful, could benefit millions of others around the world. This is the crux of the argument.

Regalado writes:
 In January, programmers in Toronto began testing a system for trading genetic information with other hospitals. These facilities, in locations including Miami, Baltimore, and Cambridge, U.K., also treat children with so-called ­Mendelian disorders, which are caused by a rare mutation in a single gene. The system, called MatchMaker Exchange, represents something new: a way to automate the comparison of DNA from sick people around the world.

One of the people behind this project is David Haussler, a bioinformatics expert based at the University of California, Santa Cruz. The problem Haussler is grappling with now is that genome sequencing is largely detached from our greatest tool for sharing information: the Internet. That’s unfortunate because more than 200,000 people have already had their genomes sequenced, a number certain to rise into the millions in years ahead. The next era of medicine depends on large-scale comparisons of these genomes, a task for which he thinks scientists are poorly prepared. “I can use my credit card anywhere in the world, but biomedical data just isn’t on the Internet,” he says. “It’s all incomplete and locked down.” Genomes often get moved around in hard drives and delivered by FedEx trucks.

Haussler is a founder and one of the technical leaders of the Global Alliance for Genomics and Health, a nonprofit organization formed in 2013 that compares itself to the W3C, the standards organization devoted to making sure the Web functions correctly. Also known by its unwieldy acronym, GA4GH, it’s gained a large membership, including major technology companies like Google. Its products so far include protocols, application programming interfaces (APIs), and improved file formats for moving DNA around the Web. But the real problems it is solving are mostly not technical. Instead, they are sociological: scientists are reluctant to share genetic data, and because of privacy rules, it’s considered legally risky to put people’s genomes on the Internet.
I for one would be willing to share my biomedical data, and have said so in my cancer blog. This puts me in good company; Steven Pinker had his genome sequenced years ago and allowed the information to be posted on the Internet, he wrote in an article for The New York Times (“My Genome, My Self”; January 7, 2009).

I have not had my genome sequenced, but if it could be done for free, I would willingly publicly share my biomedical information, not only for the sake of science, but more important for the sake of others. One of the signs of a highly developed and morally healthy society is a willingness to help others, altruism at its finest, and a willingness to better the human condition.

The perceived loss of privacy is really of little or no consequence when compared to the benefit of helping many other persons. Even so, there are obstacles of the mind. Privacy advocates have been beating the drums of privacy for decades, and have been successful in creating a climate of fear on such matters. [The question to ask is: Why is personal privacy so important?] For this reason, the transition to open sharing will not be an easy one, chiefly because people have been told not to share and, moreover, have become reticent, reluctant and spooked about sharing any personal information.

For the sharing of biomedical data to become feasible, it will require that governments ensure that such information cannot be used by, say, insurance companies to deny coverage to individuals deemed statistically “at risk” of a disease or condition they currently do not have; it will also require that societies rethink personal privacy and why it’s important; and, most of all, it will require a shift away from the idea that protecting individual privacy is more important than sharing personal information with the purpose of bettering humanity.

***************
For more, go to [MITTechReview]

Monday, February 23, 2015

Today's Cancer Is Not Yesterday's Cancer

Medical Advances & The Human Spirit

I watched Terms of Endearment a couple of days ago, a wonderful American film that I had not seen since it was first released to the public in 1983; it is about relationships and the strength of women who hold them together. Much of the film is focused on the relationship between mother (played by Shirley MacLaine) and daughter (played by Debra Winger). The relationships that each have with the men in their lives are important but take on a secondary or complementary role. Jack Nicholson and Jeff Daniels play the men.

There is a sense that relationships are both fragile and wonderful, and that when tragedy strikes, reconciliation is both possible and necessary; one of the tragedies in this film is that the daughter is diagnosed with cancer, and it is terminal. The daughter soon succumbs to the disease; the scene between her and her two boys is especially touching.

The movie spoke to me in many ways, one of which was that a diagnosis of cancer in the 1980s was often a “death sentence.” But not so today; medical research has not only greatly increased our understanding of cancer and its mechanisms, but it has also led to treatments that can prolong life without the compromising, life-draining side effects of the past. In short, we have learned a lot more about cancer, and how to treat and beat it.

The movie brings to memory my family’s situation, and what my mother and my two brothers faced. My father was diagnosed with colorectal cancer in March 1980; after enduring surgery, chemotherapy and radiation, my dad fought valiantly for life. He died eight months later, in November 1980. Fast-forward thirty years. I am two years cancer-free, or two years N.E.D. (no evidence of disease). Not cured, since there is no cure for cancer. I am acutely aware of this fact, but I do not give it much thought. Apart from some secondary side effects, I feel happy and grateful to be alive, and I live my life accordingly.

Towards the end of the film, the words “she is gone” are said; these are as powerful as “she will live.” Or in my case, “I will live.”  Thankfully, these words are more common today than they were thirty years ago. I cannot over-emphasize the power of knowledge when applied and directed to medical research and the betterment of human lives. L’chaim. To life.

This is not to say or suggest in any way that humanity has achieved its final victory over cancer. No, cancer is still a dreadful and awful disease, but not as awful as it once was. It is also true that not all cancers are treatable; I read about the case of Oliver Sacks, the eminent neurologist and writer, who has a terminal form of cancer that has metastasized to the liver; it is inoperable, untreatable. In The New York Times (“My Own Life”; February 19, 2015), Sacks, who is 81, writes:
I cannot pretend I am without fear. But my predominant feeling is one of gratitude. I have loved and been loved; I have been given much and I have given something in return; I have read and traveled and thought and written. I have had an intercourse with the world, the special intercourse of writers and readers.
Such are my sentiments, but to a lesser degree than Dr. Sacks, in relation to my talents, my achievements, and my age. I also hope to (continue to) give something in return for what I have been given. To live the good life; and to establish a legacy for my children. Such is the human spirit in action.

Tuesday, February 17, 2015

Meet The Coywolf

Nature




This is from David Suzuki’s The Nature of Things (CBC TV), a science documentary series that I started watching in the early 1970s; this episode looks at a hybrid species called the coywolf, which is part coyote, part wolf, and finds a home in urban settings, including here in Toronto. It originally aired on the CBC on July 31, 2014.

Sunday, February 15, 2015

The Mid-Winter Blues of February 2015

Blues

A View Outside My Living-Room Window:
Photo Credit: Perry J. Greenbaum, 2015


A cold but sunny day here in Toronto with clear blue skies; it’s a deceiving picture when viewed indoors. Outside, it’s minus 25C with a wind chill of minus 40C, Environment Canada, our national weather service, reports. How many more days till spring, you say? Astronomical spring begins with the vernal equinox on March 20th at 6:45 p.m. EDT, or 22:45 UT. Real spring in these parts always begins around April 15th. It’s a good day to stay indoors with a mug of Perry’s home-made hot chocolate.
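The time conversion above is easy to verify: EDT is four hours behind Universal Time, so 6:45 p.m. EDT is 22:45 UT. A minimal Python sketch, using only the date and times quoted in the paragraph above:

```python
from datetime import datetime, timedelta, timezone

# EDT (Eastern Daylight Time) is UTC-4; convert the equinox time
# quoted above (March 20, 2015, 6:45 p.m. EDT) to Universal Time.
edt = timezone(timedelta(hours=-4), name="EDT")
equinox_edt = datetime(2015, 3, 20, 18, 45, tzinfo=edt)
equinox_ut = equinox_edt.astimezone(timezone.utc)
print(equinox_ut.strftime("%H:%M UT"))  # 22:45 UT
```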

Saturday, February 14, 2015

Thinking & Writing Like Edmund Burke

Political Thought

Stable Governments are best, Edmund Burke says: "Property with peace and order; with civil and social manners . . . are good things too; and, without them, liberty is not a benefit whilst it lasts, and is not likely to continue long. The effect of liberty is to [let] individuals . . . do what they please: we ought to see what it will please them to do, before we risque congratulations, which may be soon turned into complaints,"
as cited in The Intellectual Life of Edmund Burke: From the Sublime and Beautiful to American Independence, by David Bromwich.
Image Credit & Source: Barnes & Noble

In a book review article, on David Bromwich’s The Intellectual Life of Edmund Burke: From the Sublime and Beautiful to American Independence, Ian Hampsher-Monk writes in Foreign Affairs that if Burke seemed inconsistent, it was for a valid reason. Or perhaps Burke was consistent in the one area that mattered most: he was not a radical. What mattered most was stability.

Hampsher-Monk writes:
Edmund Burke, the eighteenth-century British politician and writer, is today best known for Reflections on the Revolution in France, published in 1790. In it, Burke denounced the revolutionaries in France and their supporters in Great Britain for what he considered their misplaced faith in principles such as “abstract liberty” and “the rights of men” and for their rejection of more pragmatic, procedural paths to ending the tyranny of hereditary monarchy. As Burke put it: 


“Property with peace and order; with civil and social manners . . . are good things too; and, without them, liberty is not a benefit whilst it lasts, and is not likely to continue long. The effect of liberty is to [let] individuals . . . do what they please: we ought to see what it will please them to do, before we risque congratulations, which may be soon turned into complaints."

In a sense, the debate between Burke and his antagonists—between conservatism and radicalism, broadly defined—has shaped political debate in the Western world ever since, and Burke himself has become known as “the father of modern conservatism.”



David Bromwich is not fond of that phrase. “No serious historian today would repeat the commonplace that Burke was the father of modern conservatism,” writes the esteemed scholar of literature in his magnificent, beautifully written new study of the first half of Burke’s career, which is the most notable addition to a recent crop of books about Burke. The trouble is not only that the line between Burke and modern conservatism is hardly straight but also that Burke’s legacy is far too complex to be captured by any such phrase. Part of the problem, as Bromwich makes clear, are the tensions (and even paradoxes) within Burke’s own thinking and writing.

His condemnation of the French Revolution was preceded by his sympathy for the American one that took place two decades earlier. This gave ammunition to his radical foes, such as the critic William Hazlitt, who later wrote that by rejecting the French Revolution, Burke “abandoned not only all his practical conclusions, but all the principles on which they were founded. He proscribed all his former sentiments, denounced all his former friends, [and] rejected and reviled all the maxims to which he had formerly appealed as incontestable.”
If you have read Burke, you would agree, I think. But I do not see this as a fault, but as a moral necessity for a writer and thinker. That Burke favoured the American Revolution but not the French Revolution is easy to comprehend once you delve, even a little, into how both came about and what each achieved. For example, the Reign of Terror (1793-94), and in particular the public executions by guillotine of thousands, cast a long shadow on the revolutionary ideals. That the American ideal has lost currency today, and has been replaced by other ideas, does not take away its original great achievement.

In his sweeping historical and cultural narrative, From Dawn to Decadence: 500 Years of Western Cultural Life (2000), Jacques Barzun says the following about Burke and his view on stability in government:
The greatest political thinker of the late 18thC, Edmund Burke, had demonstrated that stable governments depend not on force, but on habit—the ingrained, far from stupid obedience to the laws and ways of the country as they have been and are. It follows that to replace by fiat one set of forms with another, thought up by some improver, no matter how intelligent, ends up in disaster. To expect such a scheme to prosper is unreasonable, because habits do not form overnight. (520-21)
When you consider this and look at places where one nation has attempted to force new habits on another (so-called nation-building), it has ended in disaster; the Middle East, of course, easily comes to mind as one not-so-shining example. In this, at least, I sense that Burke had it right.

*************************
For more, go to [ForeignAffairs].

Tuesday, February 10, 2015

The Facts On Measles

Human Health


The Anti-Vaccine Lobby is not new, and is not above giving false information to engender fear and paranoia in the public's mind. In this cartoon from June 12, 1802, the British satirist James Gillray caricatured a scene at the Smallpox and Inoculation Hospital at St. Pancras, showing Edward Jenner administering cowpox vaccine to frightened young women, and cows emerging from different parts of people’s bodies. The cartoon was inspired by the controversy over inoculating against the dreaded disease, smallpox. The inoculation agent, cowpox vaccine, was rumored to cause recipients to sprout cow-like appendages. At least this image is amusing, perhaps because its premise is so preposterous.

Source: U.S. Library of Congress Prints and Photographs Division, Washington

On Friday, February 6th, we received a letter from Toronto Public Health, which was sent to us from our two children’s schools; our two boys attend public schools (our oldest is in middle school in Grade 7, and our youngest is in elementary school, in Grade 1).

The letter, from Dr. Barbara Yaffe, director of communicable disease control and associate medical officer of health with Toronto Public Health, begins as follows:
Toronto Public Health is investigating a measles outbreak in the City of Toronto. Vaccination is the best defense against measles infection; and two doses of measles-containing vaccine (MMR or MMRV) are required for full protection.
Most Toronto schools have high vaccination rates.  However, if a measles exposure occurs in a school, students with incomplete vaccination, or an exemption from receiving the vaccine, will not be allowed to attend school until the outbreak is over.

If you are unsure of your child's immunization status, please check your child's yellow immunization card, or speak with your health care provider.  Once children are completely vaccinated they can return to school.
This is good public policy. There have been six confirmed cases of measles in Toronto: two children under the age of two and four adults. The individuals are unrelated, which causes some concern among public-health officials. And understandably so, since it is likely that there are more cases of measles in the city.

On social-media sites there has been a lot of debate and discussion about measles, a disease that has made a comeback in recent years in developed and advanced nations. What is not in doubt is that there is a cohort of persons who, for various reasons, are against science and evidence-based medicine. They are part of the counter-culture movement, which is part of the larger counter-Enlightenment (anti-rational) group that views science and authority with suspicion and doubt.

Information is gathered and discarded based on its emotional appeal and whether others of similar minds are in agreement. They are found on both the right and left of the political spectrum, and in both secular and religious camps. What they have in common is that the scientific method is neither applied nor understood. This is a growing concern. A few days ago, I posted an article, citing National Geographic (“Why Do So Many Reasonable People Doubt Science?,” by Joel Achenbach), on science doubters.

A subset of this group is the anti-vaccine movement, or anti-vaxxers: militant and ill-informed, yet certain and firm in their decisions. They might have “good intentions,” though even this is in doubt, but they are nonetheless responsible for the increase in incidences of measles; there is no other or softer way to present this information. A lack of scientific knowledge combined with magical thinking leads to such faulty and dangerous decision-making. Moreover, society is placed at risk because of it.

For those interested in facts, here is what the World Health Organization reports on measles; you will note a correlation between increased global vaccination of children and decreased mortality. Even so, despite the availability of vaccines, 145,700 people, chiefly children, died in 2013 from contracting measles.

On its website, the WHO reports the following relevant information in Fact sheet no. 286:
Measles is a highly contagious, serious disease caused by a virus. In 1980, before widespread vaccination, measles caused an estimated 2.6 million deaths each year.

The disease remains one of the leading causes of death among young children globally, despite the availability of a safe and effective vaccine. Approximately 145 700 people died from measles in 2013 – mostly children under the age of 5.

Measles is caused by a virus in the paramyxovirus family and it is normally passed through direct contact and through the air. The virus infects the mucous membranes, then spreads throughout the body. Measles is a human disease and is not known to occur in animals.

Accelerated immunization activities have had a major impact on reducing measles deaths. During 2000-2013, measles vaccination prevented an estimated 15.6 million deaths. Global measles deaths have decreased by 75% from an estimated 544 200 in 2000 to 145 700 in 2013.
You do not need to be a scientist, or hold a graduate science degree, to understand these numbers and what words like “correlation” and “causation” mean. It helps, though, if you have confidence in science (and do not deny its validity and efficacy), and openly acknowledge how medical advances, like vaccines, have improved our lives.
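The WHO figures quoted above lend themselves to a quick back-of-the-envelope check; a few lines of Python, using the 2000 and 2013 estimates as stated, show a decline of roughly three-quarters:

```python
# Quick check of the WHO estimates quoted above.
deaths_2000 = 544_200  # estimated global measles deaths in 2000
deaths_2013 = 145_700  # estimated global measles deaths in 2013

decline = (deaths_2000 - deaths_2013) / deaths_2000
print(f"Decline in measles deaths, 2000-2013: {decline:.0%}")  # prints "Decline in measles deaths, 2000-2013: 73%"
```

The rounded estimates as quoted yield about 73 percent, in line with the roughly 75% decline the WHO reports from its unrounded data.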

The measles vaccine, like all great discoveries, is an achievement in itself; it was developed by John Enders at Boston Children’s Hospital in 1963, and improved by Dr. Maurice Hilleman of Merck in 1971 to incorporate three vaccines into one dose: for measles, mumps and rubella (MMR). Combining the three into a single dose reduced six shots to two. Dr. Hilleman, The New York Times reports, “devised or substantially improved more than 25 vaccines, including 9 of the 14 now routinely recommended for children.”

Vaccinations are not only a great scientific advancement in human health; they are also a great public-health benefit. My wife (a nurse) and I are proponents of childhood vaccination, and accordingly have vaccinated all of our children in accordance with the schedule set out by knowledgeable public-health agencies like the Public Health Agency of Canada, the Canadian Paediatric Society and Ontario’s Ministry of Health. As a parent, I cannot overemphasize to other parents how important it is to ensure that their children are vaccinated. It’s the smart and reasonable thing to do. Or, if you consider fashion important, it’s the fashionable thing to do.

********************
For more, go to [WHO]

Sunday, February 8, 2015

Doubting Science

Communicating Science

Old Conflicts: “In 1925 in Dayton, Tennessee, where John Scopes was standing trial for teaching evolution in high school, a creationist bookseller hawked his wares. Modern biology makes no sense without the concept of evolution, but religious activists in the United States continue to demand that creationism be taught as an alternative in biology class. When science conflicts with a person’s core beliefs, it usually loses.”
Photo Credit: Bettmann/Corbis
Source: NatGeo


An article, by Joel Achenbach, in National Geographic raises the important question of why so many Americans doubt science; there are not only climate-change deniers (only 40 percent of Americans agree that human activity is the chief reason for climate change), there are also anti-vaxxers and anti-evolution individuals whose chief motivation, it seems, is to question everything that emanates from science, or at least those ideas with which they disagree.

Questioning and disagreement are part of human activity, but this goes beyond questioning to gather facts so as to arrive at an informed decision. Something else is at play here, which has everything to do with our brain’s need to make sense of uncertainty and randomness. Change can be difficult. The fast-changing world of technology and the great advances in science and medicine can be confusing, not comforting, to many. It can be highly disquieting.

Achenbach writes:
We live in an age when all manner of scientific knowledge—from the safety of fluoride and vaccines to the reality of climate change—faces organized and often furious opposition. Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts. There are so many of these controversies these days, you’d think a diabolical agency had put something in the water to make people argumentative. And there’s so much talk about the trend these days—in books, articles, and academic conferences—that science doubt itself has become a pop-culture meme. In the recent movie Interstellar, set in a futuristic, downtrodden America where NASA has been forced into hiding, school textbooks say the Apollo moon landings were faked.

In a sense all this is not surprising. Our lives are permeated by science and technology as never before. For many of us this new world is wondrous, comfortable, and rich in rewards—but also more complicated and sometimes unnerving. We now face risks we can’t easily analyze.

We’re asked to accept, for example, that it’s safe to eat food containing genetically modified organisms (GMOs) because, the experts point out, there’s no evidence that it isn’t and no reason to believe that altering genes precisely in a lab is more dangerous than altering them wholesale through traditional breeding. But to some people the very idea of transferring genes between species conjures up mad scientists running amok—and so, two centuries after Mary Shelley wrote Frankenstein, they talk about Frankenfood.

The world crackles with real and imaginary hazards, and distinguishing the former from the latter isn’t easy. Should we be afraid that the Ebola virus, which is spread only by direct contact with bodily fluids, will mutate into an airborne superplague? The scientific consensus says that’s extremely unlikely: No virus has ever been observed to completely change its mode of transmission in humans, and there’s zero evidence that the latest strain of Ebola is any different. But type “airborne Ebola” into an Internet search engine, and you’ll enter a dystopia where this virus has almost supernatural powers, including the power to kill us all.
In some cases, this anti-science view emanates from the old and continuing conflict between religion (or, rather, faith) and science, which cannot mutually co-exist in the same mind and body. When core or fundamentalist beliefs conflict with science, beliefs win and science loses. Some are deeply informed by religion; others by a combative or conspiratorial frame of reference that doubts authority and facts—a counter-culture view that first found favour during the Romantic era as a counterweight against the cold rationalism of the European Enlightenment. As I and others have argued elsewhere, while emotions and feelings are suitable and wonderful for the arts, they have no place in science. At the core of science is the tried and true scientific method.

No amount of evidence will convince the romantics; and the Internet makes it easier for such people to find like-minded individuals who “feel the same way.” And such people become the “experts” and sources of information. Yes, it's emotional, which is not how science works. Most deniers, doubters and truthers, as they are called, are not scientifically literate, which explains much, but not all.

That they do societal harm, as is certainly the case with anti-vaxxers, is undeniable. But such folks think, or at least feel, that their questioning of authority, of the official position, is actually the right thing to do. They have certainty on their side (while science is often uncertain and provisional), and anger and rage to fuel this certainty. That they can be wrong does not enter their minds, so certain are they of their position and views. This, of course, is the opposite of the scientific mind, which, on the basis of evidence, can and does change its views.

Science, if anything, is self-correcting; and this is its strength.

**********************
For more, go to [NatGeo]

Saturday, February 7, 2015

Google Working On Cancer-Detection Wristband

Innovative Medicine

Andrew Conrad (left), molecular biologist, at Google X: “So imagine that you swallow a pill [You would take a pill maybe twice a month] and that pill has small things called nanoparticles in it, decorated on their surface with markers that attach to cancer cells. We have them circulate through your whole body, and we collect them in the vasculature of the arm with a magnet, and you ask them what they saw.”
Photo Credit: Google
Source: BBC

In an article in Physics.org, Nancy Owano writes about Google’s plans to refashion medicine to make it both more personal and proactive; one of its most noteworthy research projects is designing a wristband that would be an early detector of cancer. The project leader is Andrew Conrad, a molecular biologist, who joined Google X, the company’s research unit, as head of Life Sciences in 2013.

Owano writes:
In brief, Google is designing a system where tiny magnetic particles patrol the human body for signs of cancer and other diseases. UPI's Brooks Hays said that "the pill would release nanoparticles into a patient's bloodstream; the magnetized particles would tour the body seeking out cancer cells to bind to. A wearable monitor would attract and count the particles, pulling information as to what the particles had detected." Cancer cells, for example, would light up. How does light pass through skin? To understand that, Google started to make synthetic skin.

For their arm model, they had to use materials that behave like skin with biocomponents of real arms. Also, Google is monitoring 175 healthy volunteers, collecting physiological data frequently. The goal is to understand what defines a “healthy” person, to know what “normal” is. They need to understand the baseline. In the video, Conrad had a memorable reply when his interviewer asked if some people would feel weird having nanoparticles floating through their body as trackers. “It’s way weirder,” said Conrad, “to have cancer cells floating through your body that are constantly trying to kill you.”
How true. When you think about it for a minute or so, you tend to agree with Conrad; this is the kind of innovative and advanced thinking that leads to the betterment of human life. Most humans want to live not only a long life, but also a healthy one free of disease and debilitating illness. The trick, so to speak, is to catch disease early, when it can do the least damage. After all, early detection (and diagnosis), along with prevention and precise treatment, form the three components of modern medicine.

***************
For more, go to [Physics.org]

Friday, February 6, 2015

How To Punish ISIS

State Of The World
The best way to fight evil is to fight evil, and declare it so. ISIS is evil, and if there are now any doubts, the burning of a Jordanian prisoner of war should put any doubts to rest. Now is not the time to be silent, and all freedom-loving nations of the world ought to unite to defeat this beast. Silence has never worked; it only emboldens the brutal and the barbaric. Action is required, and Prof. George Jochnowitz writes about a particular action that will help defeat ISIS.



***********************************
by George Jochnowitz
President Obama should invite Binyamin Netanyahu to the White House and make a statement saying that the United States needs to benefit and honor Israel as a way to fight terrorism. He should say, “We are inviting Israel’s Prime Minister in order to punish you, ISIS, for burning an innocent man to death. Your actions have made us understand that we must fight you by joining with Israel. Whenever there is an act of terrorism in the world, we will respond by granting Israel a diplomatic or military favor.”

Leaders of the world everywhere would scream in horror at this action. But they did the same thing when Israel announced it would withdraw from Gaza. Even before the withdrawal took place, on July 9, 2005, 171 non-governmental organizations voted to boycott, divest from, and impose sanctions against Israel. That was the beginning of the BDS Movement.

A thousand years ago, Islam was open to rationality and science. There were Muslim scientists who made important discoveries. In 1006, as Professor Bernard Goldstein has shown, `Ali b. Ridwan described a supernova and located it with great precision in the constellation Lupus. Contemporary scientists have checked the information out and found it to be accurate.

Today, on the other hand, the energies—intellectual and otherwise—of the Islamic world are devoted to irrational hatred. The primary cause is hatred of Israel, but when hatred exists, it lands on other targets as well. ISIS has not directly attacked Israel at all—its victims have been Yazidis, Shiites, and now an innocent Jordanian who was burned alive. These targets are being attacked because hatred knows no bounds.

What is happening in the Middle East is analogous to what happened to the Roma and Sinti peoples, formerly known as Gypsies. The Nazis hated them, to be sure, but there would not have been an attempt to exterminate them if the Nazis hadn’t already gotten into the practice of exterminating minorities because they were trying to rid the world of Jewish genes. One hatred leads to another.

Thomas L. Friedman wrote an Op-Ed piece in the New York Times telling Binyamin Netanyahu not to get involved in the controversy concerning a treaty with Iran. “Just lie low, Mr. Netanyahu. Don’t play in our politics.”

Friedman’s column reminds one of the New York Times and its silence during World War II, and of America’s very slow response to the Holocaust. Both the Times and the Roosevelt administration were afraid that making an obvious effort to rescue Jews from the death camps would lead the world to believe that American involvement in World War II was simply the result of Jewish pressure. Independent of the fact that anti-Semitism was a force motivating Hitler, the United States needed to defend itself against the Axis powers. Independent of the fact that anti-Zionism is driving Muslim extremists crazy, the United States needs to defend itself against terrorists.

Showing extremists that they are helping Israel by their vicious acts is a way for America to fight these fanatics.
*********************************** 
George Jochnowitz was born in New York City, in 1937. He became aware of different regional pronunciations when he was six, and he could consciously switch accents as a child. He got his Ph.D. in linguistics from Columbia University and taught linguistics at the College of Staten Island, CUNY. His area of specialization was Jewish languages, in particular, Judeo-Italian dialects. As part of a faculty-exchange agreement with Hebei University in Baoding, China, he was in China during the Tiananmen Massacre. He can be reached at george@jochnowitz.net.

***********************************
Copyright ©2015. George Jochnowitz. All Rights Reserved. 

Thursday, February 5, 2015

The German Romantics

Emotion & Reason


Wanderer Above the Sea of Fog (1818) by Caspar David Friedrich
Image Source: Weekly Standard
In a book review in The Weekly Standard of Rüdiger Safranski’s Romanticism: A German Affair, Thomas A. Kohut says that this is an excellent book for tracing and understanding the roots of German Romanticism. First, however, one ought to distinguish between two words that sound similar: Romanticism and Romantics.

Kohut says:
 The author distinguishes between “Romanticism” and “Romantics.” The former was a circumscribed historical period beginning (in Safranski’s account) in 1769 with Johann Gottfried von Herder’s voyage from Riga and ending in the 1820s with E. T. A. Hoffmann and Joseph von Eichendorff. The Romantic era is the subject of the first half of the book. The “Romantics,” then, are the individual thinkers who carried on the tradition of Romanticism after the Romantic era was over: Romantics—down to the student rebels of the late 1960s—are the subject of the book’s second half. According to Safranski, the idea animating Romanticism from 1770 until the 1820s, and Romantics to the present day, is “that the beam of our awareness does not illuminate the entirety of our experience, that our consciousness cannot grasp our whole Being, that we have a more intimate connection with the life process than our reason would like to believe.”
This is a counter-Enlightenment idea, a reaction against the European Enlightenment, and it remains strong and present today.

Now, it must be said that we all carry or court some romantic notions (some obviously more revealed than others), and that not every part of our being, even in the most rational of us, is informed by reason alone. Romantics believe in the use of emotion and feeling, which is excellent for art and creative endeavors such as painting, music, fiction writing and dance. The music of Mendelssohn, Schumann and Brahms is enjoyable and hauntingly beautiful; the writings of Wolfgang von Goethe, Friedrich von Schiller, and Friedrich Hölderlin can inspire us with their transcendent language and their search for deeper meaning. The landscape paintings of Carl Blechen, Joseph Anton Koch, and Caspar David Friedrich can place us in a dreamy state with their muted visions of nature.

This romantic feeling becomes problematic, even dangerous, however, when applied to the political realm, which succinctly explains the mindset of revolutionaries and extremists. Reality is coloured by what could be, which in most cases (perhaps all) requires violence to achieve; the finer sensibilities of art turn violent in the political arena. It must be noted that both Marx and Engels were influenced by Romanticism and were indeed 19th-century Romantics. Their desire was to force personal destiny onto others; their personal desires led to tragedy.

This view thankfully places me in the company of both the book’s author and reviewer, the latter of whom writes:
The author is sympathetic to Romanticism, when it remains in the aesthetic realm, as having the potential to enrich and fulfill a life and a world that would otherwise be sterile and superficial, a literal life and a literal world. The problem comes when Romanticism enters the political realm. Whereas the Romantic craves adventure, intense experiences, and extremes, successful politics depends on compromise, rational discourse, consensus, and achievement that is mostly partial and prosaic.
I would recommend that you read the rest of this fine review as a brief introduction to the benefits and perils of taking on a Romantic view of life.

****************
For more, go to [WeeklyStandard]
 

Wednesday, February 4, 2015

World Cancer Day: Not Beyond Us

Human Health

World Cancer Day: This is what the site says about it, which, sadly, given cancer’s presence in people’s lives, is somewhat lacking. It sounds as if it were written by a PR firm, not by anyone directly involved in cancer research or treatment, let alone by anyone affected by cancer:
“World Cancer Day is a unique opportunity to raise awareness that there is much that can be done at an individual, community and governmental level, to harness and mobilise these solutions and catalyse positive change. By moving forward together we have the potential to show: Cancer. It is not beyond us.”
Source & Credit: WorldCancerDay


Today is World Cancer Day, about which the eponymous website of the Geneva-based Union for International Cancer Control (UICC) says:
Taking place under the tagline ‘Not beyond us’, World Cancer Day 2015 will take a positive and proactive approach to the fight against cancer, highlighting that solutions do exist across the continuum of cancer, and that they are within our reach.
I understand the positive and proactive approach, but the language could be clearer, stronger, more direct. Is that the best that a global site dedicated to cancer can do or say? If so, they need to seriously rethink what they are doing, or at least how they present themselves to the public. I find this message disappointing and believe it can be improved. Given my intimate experience with and knowledge of cancer, here is what I would write, if only someone would ask me:
Turn knowledge into action. We need to fund more research on cancer. We are now at a critical point in our battle with cancer—a disease that knows no boundaries, a disease that is indiscriminate, a disease that affects millions of people globally—but more has to be done. This takes fundamental research, tapping into the minds of the world's greatest scientists to make it happen. This takes money. Turn knowledge into action. You can help make a difference in the lives of cancer patients and their families.
 This is better. Much better.

Tuesday, February 3, 2015

Growing Cheap & Efficient Solar Cells

Solar Technology


Perovskite Structure with a chemical formula ABX3. Wikipedia says: “The red spheres are X atoms (usually oxygens), the blue spheres are B-atoms (a smaller metal cation, such as Ti4+), and the green spheres are the A-atoms (a larger metal cation, such as Ca2+). Pictured is the undistorted cubic structure; the symmetry is lowered to orthorhombic, tetragonal or trigonal in many perovskites.”
Source: Wikipedia; original uploader: Cadmium

An article in the journal Compound Semiconductor discusses perovskites, which can convert parts of the solar spectrum into electricity more efficiently than silicon, thus reducing the cost of solar power. The article ("Los Alamos Develops New Technique For Growing Perovskite Solar Cells"; 30th January 2015) says:
This week in the journal Science, Los Alamos National Laboratory researchers revealed a new solution-based hot-casting technique that allows growth of highly efficient and reproducible solar cells from large-area perovskite crystals.

"These perovskite crystals offer promising routes for developing low-cost, solar-based, clean global energy solutions for the future," said Aditya Mohite, the Los Alamos scientist leading the project.

State-of-the-art photovoltaics using high-purity, large-area, wafer-scale single-crystalline semiconductors grown by sophisticated, high temperature crystal-growth processes are seen as the future of efficient solar technology. Solar cells composed of organic-inorganic perovskites offer efficiencies approaching that of silicon, but they have been plagued with some important deficiencies limiting their commercial viability. It is this failure that the Los Alamos technique successfully corrects.
As a note of interest, Wikipedia says, “Perovskites take their name from the mineral, which was first discovered in the Ural mountains of Russia by Gustav Rose in 1839 and is named after Russian mineralogist L.A. Perovski (1792–1856).” Perovskites have the same structure as calcium titanium oxide (CaTiO3).
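As an aside on how that shared structure is judged: crystallographers often use the Goldschmidt tolerance factor, t = (rA + rX) / (√2 · (rB + rX)), to estimate whether a given ABX3 composition will adopt the perovskite structure. The short sketch below is my own illustration, not something from the article; the ionic radii are approximate Shannon values, and values of t near 1.0 favor the ideal cubic structure.

```python
from math import sqrt

def tolerance_factor(r_a, r_b, r_x):
    """Goldschmidt tolerance factor t = (r_A + r_X) / (sqrt(2) * (r_B + r_X))."""
    return (r_a + r_x) / (sqrt(2) * (r_b + r_x))

# Approximate Shannon ionic radii in angstroms: Ca2+ (A site), Ti4+ (B site), O2- (X site)
t = tolerance_factor(r_a=1.34, r_b=0.605, r_x=1.40)
print(round(t, 2))  # a value near 1.0 favors the ideal cubic perovskite structure
```

For CaTiO3 this gives roughly 0.97, which is why the mineral sits comfortably in the perovskite family.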

As a class of crystals, they have drawn the attention of energy scientists interested in the field of photovoltaics, “because of their low cost, high charge-carrier mobility, and long diffusion lengths,” says the engineering journal IEEE Spectrum in a recent article on the subject. “In real world terms, this means that the electrons in perovskite-based photovoltaics can travel through thicker solar cells, which absorb more light and thereby generate more electricity than thinner cells.”

For a newer technology to displace an older, established one, it has to prove both cost-effective and efficient; solar technology is now very close to achieving this and is in a good position to displace (and replace) fossil fuels. This is something to cheer about.

**************
For more, go to [CompoundSemiconductor]

Monday, February 2, 2015

The Future Of Graphene

Material Science




There is a long distance between discovery and application, between research and putting a material to profitable and continuous use. This is now the case with graphene, says an article by John Colapinto in The New Yorker. It traces the initial discovery, more than a decade ago, by Andre Konstantin Geim, a physicist at the University of Manchester, who was able to isolate a single atomic layer of carbon, called graphene.

Colapinto writes:
On one such evening, in the fall of 2002, Geim was thinking about carbon. He specializes in microscopically thin materials, and he wondered how very thin layers of carbon might behave under certain experimental conditions. Graphite, which consists of stacks of atom-thick carbon layers, was an obvious material to work with, but the standard methods for isolating superthin samples would overheat the material, destroying it. So Geim had set one of his new Ph.D. students, Da Jiang, the task of trying to obtain as thin a sample as possible—perhaps a few hundred atomic layers—by polishing a one-inch graphite crystal. Several weeks later, Jiang delivered a speck of carbon in a petri dish. After looking at it under a microscope, Geim recalls, he asked him to try again; Jiang admitted that this was all that was left of the crystal. As Geim teasingly admonished him (“You polished a mountain to get a grain of sand?”), one of his senior fellows glanced at a ball of used Scotch tape in the wastebasket, its sticky side covered with a gray, slightly shiny film of graphite residue.
It would have been a familiar sight in labs around the world, where researchers routinely use tape to test the adhesive properties of experimental samples. The layers of carbon that make up graphite are weakly bonded (hence its adoption, in 1564, for pencils, which shed a visible trace when dragged across paper), so tape removes flakes of it readily. Geim placed a piece of the tape under the microscope and discovered that the graphite layers were thinner than any others he’d seen. By folding the tape, pressing the residue together and pulling it apart, he was able to peel the flakes down to still thinner layers.
Geim had isolated the first two-dimensional material ever discovered: an atom-thick layer of carbon, which appeared, under an atomic microscope, as a flat lattice of hexagons linked in a honeycomb pattern. Theoretical physicists had speculated about such a substance, calling it “graphene,” but had assumed that a single atomic layer could not be obtained at room temperature—that it would pull apart into microscopic balls. Instead, Geim saw, graphene remained in a single plane, developing ripples as the material stabilized.
Geim and Konstantin Novoselov won the Nobel Prize in Physics in 2010 for this discovery. And, as is often the case with such discoveries, people began looking at ways to monetize it.
By then, the media were calling graphene “a wonder material,” a substance that, as the Guardian put it, “could change the world.” Academic researchers in physics, electrical engineering, medicine, chemistry, and other fields flocked to graphene, as did scientists at top electronics firms.
So far, graphene, wondrous as it is, has eluded the best minds’ efforts to put it to practical use. This does not in any way suggest that graphene can’t be put to practical use, but rather that a way to do so economically and easily has not yet been discovered.

*************************
For more, go to [NewYorker]

Sunday, February 1, 2015

Talking To Animals

Inter-Species Dialogue

Animal-Human Communication: “Bagheera and Mowgli by the Detmold brothers from a 1903 edition of The Jungle Book by Rudyard Kipling.”
Photo Credit: Corbis

Source: Aeon 


Many of us are fascinated with animals, even those of us who do not currently reside with any non-human beings. Among the questions raised are whether animals genuinely understand human speech and, conversely, whether animals can communicate with humans. An article by Stassa Edwards in Aeon raises these questions, and more.

Edwards writes:
Whether or not animals have the ability to speak, or even have inner lives to speak of, is another question. We know that animals can communicate with one another – their hierarchies and intricate rituals have been quantified and observed. But humans are disposed to anthropomorphism. We tend to project our thoughts onto other species. We can’t help but infer consciousness when a cat purrs at a welcome touch or a dog betrays a guilty look when scolded for stealing food. But does your cat really have an inner life?

In his Apology for Raymond Sebond (1576), Michel de Montaigne ascribed animals’ silence to man’s own wilful arrogance. The French essayist argued that animals could speak, that they were in possession of rich consciousness, but that man wouldn’t condescend to listen. ‘It is through the vanity of the same imagination that [man] equates himself with God,’ Montaigne wrote, ‘that he attributes divine attributes for himself, picks himself out and separates himself from the crowd of other creatures.’ Montaigne asked: ‘When I play with my cat, who knows if she is making more of a pastime of me than I of her?’

Montaigne’s question is as playful as his cat. Apology is not meant to answer the age-old question, but rather to provoke; to tap into an unending inquiry about the reasoning of animals. Perhaps, Montaigne implies, we simply misunderstand the foreign language of animals, and the ignorance is not theirs, but ours.

Montaigne’s position was a radical one – the idea the animals could actually speak to humans was decidedly anti-anthropocentric – and when he looked around for like-minded thinkers, he found himself one solitary essayist. But if Montaigne was a 16th century loner, then he could appeal to the Classics. Apology is littered with references to Pliny and a particular appeal to Plato’s account of the Golden Age under Saturn. But even there, Montaigne had little to work with. Aristotle had argued that animals lacked logos (meaning, literally, ‘word’ but also ‘reason’) and, therefore, had no sense of the philosophical world inhabited and animated by humans. And a few decades after Montaigne, the French philosopher René Descartes delivered the final blow, arguing that the uniqueness of man stems from his ownership of reason, which animals are incapable of possessing, and which grants him dominion over them.
Is this necessarily true? Does humans’ possession of the faculty of reason make us the dominant creature on this planet? This is an age-old argument that informs the views of many, and it leads to other important questions of stewardship and conservation. As for reason, no doubt I advocate a reasonable mind in matters political and economic, and of course scientific.

So, while reason is important and has a prominent place in our society, and while its application has advanced us in so many ways, it might not be as important (the cat’s meow) in communicating with and understanding other species. It might actually act as a barrier, a wall separating us from the inner worlds of other species. Yes, there is a fascination with trying to understand what other animals think, and opening a channel of communication would help this cause.

In this case, I might be on the same side as Montaigne; anyone who has resided with an animal (I grew up with cats) understands this intimately.

*****************
For more, go to [Aeon]