Tuesday, November 25, 2014

indict a ham sandwich


fivethirtyeight.com | A St. Louis County grand jury on Monday decided not to indict Ferguson, Missouri, police Officer Darren Wilson in the August killing of teenager Michael Brown. The decision wasn’t a surprise — leaks from the grand jury had led most observers to conclude an indictment was unlikely — but it was unusual. Grand juries nearly always decide to indict.
Or at least, they nearly always do so in cases that don’t involve police officers.
Former New York state Chief Judge Sol Wachtler famously remarked that a prosecutor could persuade a grand jury to “indict a ham sandwich.” The data suggests he was barely exaggerating: According to the Bureau of Justice Statistics, U.S. attorneys prosecuted 162,000 federal cases in 2010, the most recent year for which we have data. Grand juries declined to return an indictment in 11 of them.
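To make that figure concrete, the arithmetic below only restates the two numbers quoted above (162,000 cases, 11 declined indictments); it adds no data of its own.

```python
# Restating the Bureau of Justice Statistics figures quoted above:
# 162,000 federal cases prosecuted in 2010; grand juries declined
# to return an indictment in just 11 of them.
cases = 162_000
declined = 11

decline_rate = declined / cases
print(f"declined: {decline_rate:.4%} of cases")      # about 0.0068%
print(f"indicted: {1 - decline_rate:.4%} of cases")  # about 99.9932%
```

In other words, outside of police cases, federal grand juries return an indictment more than 99.99 percent of the time.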

embarrassing multi-level incompetence, or, conspicuously orchestrated peasant stampede?






WaPo |  In the three months since Brown’s killing, we have come to learn that collective failures of leadership are just the tip of the iceberg of problems in Ferguson. My colleague Radley Balko reported extensively on how municipalities in St. Louis County, Mo., profit from poverty. “If you were tasked with designing a regional system of government guaranteed to produce racial conflict, anger, and resentment,” he wrote, “you’d be hard pressed to do better than St. Louis County.” The inherent mistrust of police, the grand jury process and the motives of elected and law enforcement officials that we have seen from blacks in Ferguson can be traced back to Balko’s observation.

Monday, November 24, 2014

let's play nsa!


motherboard.vice |  Prior to the release of the ANT catalog, the last time the public had heard anything about retro-reflection technology being used in a surveillance device was in 1960. And the technology became such a sensation that it earned one of the most iconic nicknames of the Cold War.
On August 4, 1945, as World War II was winding down and new tensions with the Soviets were starting to wind up, Russian schoolchildren paid a visit to the American Ambassador in Moscow and bestowed upon him a token of good will: a Great Seal of the United States. The Ambassador hung it in his residential study.

There it hung until one day in 1952, when a British radio technician in Moscow, listening in on Russian air traffic, discovered something unexpected on one frequency: the sound of the British ambassador, loud and clear, along with other American-accented conversations. Thus began one of many exhaustive tear-downs of the embassy. They were looking to find a listening device—and they did, along with a new frontier of spying. The culprit was the Great Seal.

Inside it, the Americans and British found a tiny device the likes of which they’d never seen. So alien was the Great Seal Bug that the only appropriate name for it seemed to be “The Thing,” after the character in the Addams Family (which was then still just a New Yorker cartoon). It was a retroreflector.

“The Thing” turned out to have been invented by the legendary Russian engineer Lev Sergeyevich Termen, or Leon Theremin, who may be most famous as the father of the spooky radio-based instrument named after him, but is also considered a pioneer of RFID technology.

But perhaps surprisingly, despite all the public interest in the revelation, “The Thing” did not seem to herald more “things.” In the history of espionage technology, it was a great story, but ultimately a footnote. As far as the public knew, after its fantastical discovery there were fifty-three years of radio silence, so to speak.

“In hindsight,” Ossmann said, “it’s obvious that these types of attacks are practical and employed. For someone who knows a little bit about electronics and a little bit about security, RF retroreflectors should be completely unsurprising. However, I couldn’t find anyone who had published any research on the subject at all. That was astonishing.”

(This is where things get a bit complicated again; it's worth it, but if you simply can't deal with the details, take my word for it, and skip down to the next section.)  Fist tap Arnach.

babes in toyland where cost is not an issue...,


spiegel |  When it comes to modern firewalls for corporate computer networks, the world's second largest network equipment manufacturer doesn't skimp on praising its own work. According to Juniper Networks' online PR copy, the company's products are "ideal" for protecting large companies and computing centers from unwanted access from outside. They claim the performance of the company's special computers is "unmatched" and their firewalls are the "best-in-class." Despite these assurances, though, there is one attacker none of these products can fend off -- the United States' National Security Agency.

Specialists at the intelligence organization succeeded years ago in penetrating the company's digital firewalls. A document viewed by SPIEGEL resembling a product catalog reveals that an NSA division called ANT has burrowed its way into nearly all the security architecture made by the major players in the industry -- including American global market leader Cisco and its Chinese competitor Huawei, but also producers of mass-market goods, such as US computer-maker Dell. 

A 50-Page Catalog
These NSA agents, who specialize in secret back doors, are able to keep an eye on all levels of our digital lives -- from computing centers to individual computers, and from laptops to mobile phones. For nearly every lock, ANT seems to have a key in its toolbox. And no matter what walls companies erect, the NSA's specialists seem already to have gotten past them. 

This, at least, is the impression gained from flipping through the 50-page document. The list reads like a mail-order catalog, one from which other NSA employees can order technologies from the ANT division for tapping their targets' data. The catalog even lists the prices for these electronic break-in tools, with costs ranging from free to $250,000.

gen. michael hayden brought the elite hacknological bacon home to the usaf...,


spiegel |  The NSA's TAO hacking unit is considered to be the intelligence agency's top secret weapon. It maintains its own covert network, infiltrates computers around the world and even intercepts shipping deliveries to plant back doors in electronics ordered by those it is targeting.

It was thanks to the garage door opener episode that Texans learned just how far the NSA's work had encroached upon their daily lives. For quite some time now, the intelligence agency has maintained a branch with around 2,000 employees at Lackland Air Force Base, also in San Antonio. In 2005, the agency took over a former Sony computer chip plant in the western part of the city. A brisk pace of construction commenced inside this enormous compound. The acquisition of the former chip factory at Sony Place was part of a massive expansion the agency began after the events of Sept. 11, 2001.

On-Call Digital Plumbers
One of the two main buildings at the former plant has since housed a sophisticated NSA unit, one that has benefited the most from this expansion and has grown the fastest in recent years -- the Office of Tailored Access Operations, or TAO. This is the NSA's top operative unit -- something like a squad of plumbers that can be called in when normal access to a target is blocked.

According to internal NSA documents viewed by SPIEGEL, these on-call digital plumbers are involved in many sensitive operations conducted by American intelligence agencies. TAO's area of operations ranges from counterterrorism to cyber attacks to traditional espionage. The documents reveal just how diversified the tools at TAO's disposal have become -- and also how it exploits the technical weaknesses of the IT industry, from Microsoft to Cisco and Huawei, to carry out its discreet and efficient attacks.

The unit is "akin to the wunderkind of the US intelligence community," says Matthew Aid, a historian who specializes in the history of the NSA. "Getting the ungettable" is the NSA's own description of its duties. "It is not about the quantity produced but the quality of intelligence that is important," one former TAO chief wrote, describing her work in a document. The paper seen by SPIEGEL quotes the former unit head stating that TAO has contributed "some of the most significant intelligence our country has ever seen." The unit, it goes on, has "access to our very hardest targets."

A Unit Born of the Internet
Defining the future of her unit at the time, she wrote that TAO "needs to continue to grow and must lay the foundation for integrated Computer Network Operations," and that it must "support Computer Network Attacks as an integrated part of military operations." To succeed in this, she wrote, TAO would have to acquire "pervasive, persistent access on the global network." An internal description of TAO's responsibilities makes clear that aggressive attacks are an explicit part of the unit's tasks. In other words, the NSA's hackers have been given a government mandate for their work. During the middle part of the last decade, the special unit succeeded in gaining access to 258 targets in 89 countries -- nearly everywhere in the world. In 2010, it conducted 279 operations worldwide.

Indeed, TAO specialists have directly accessed the protected networks of democratically elected leaders of countries. They infiltrated networks of European telecommunications companies and gained access to and read mails sent over Blackberry's BES email servers, which until then were believed to be securely encrypted. Achieving this last goal required a "sustained TAO operation," one document states.

Sunday, November 23, 2014

aggregate intelligence


radiolab |   What happens when there is no leader? Starlings, bees, and ants manage just fine. In fact, they form staggeringly complicated societies -- all without a Toscanini to conduct them into harmony. This hour of Radiolab, we ask how this happens.

We gaze down at the bottom-up logic of cities, Google, and even our very own brains with fire-flyologists, ant experts, neurologists, a mathematician, and an economist.

Saturday, November 22, 2014

Órale buybull buddies - use culo-calming ointment if you go and see this movie...,




quantum solution to the arrow of time dilemma?


physorg |  Entropy can decrease, according to a new proposal - but the process would destroy any evidence of its existence, and erase any memory an observer might have of it. It sounds like the plot to a weird sci-fi movie, but the idea has recently been suggested by theoretical physicist Lorenzo Maccone, currently a visiting scientist at MIT, in an attempt to solve a longstanding paradox in physics.
 
The laws of physics, which describe everything from electricity to moving objects to energy conservation, are time-invariant. That is, the laws still hold if time is reversed. However, this time reversal symmetry is in direct contrast with everyday phenomena, where it’s obvious that time moves forward and not backward. For example, when milk is spilt, it can’t flow back up into the glass, and when pots are broken, their pieces can’t shatter back together. This irreversibility is formalized through the second law of thermodynamics, which says that entropy always increases or stays the same, but never decreases.

This contrast has created a reversibility paradox, also called Loschmidt’s paradox, which scientists have been trying to understand since Johann Loschmidt began considering the problem in 1876. Scientists have proposed many solutions to the conundrum, from trying to embed irreversibility in physical laws to postulating low-entropy initial states.

Maccone’s idea, published in a recent issue of Physical Review Letters, is a completely new approach to the paradox, based on the assumption that quantum mechanics is valid at all scales. He theoretically shows that entropy can both increase and decrease, but that it must always increase for phenomena that leave a trail of information behind. Entropy can decrease for certain phenomena (when correlated with an observer), but these phenomena won’t leave any information of their having happened. For these situations, it’s like the phenomena never happened at all, since they leave no evidence. As Maccone explains, the second law of thermodynamics is then reduced to a mere tautology: physics cannot study processes where entropy has decreased, due to a complete absence of information. The solution allows for time-reversible phenomena to exist (in agreement with the laws of physics), but not be observable (in agreement with the second law of thermodynamics).

In his study, Maccone presents two thought experiments to illustrate this idea, followed by an analytical derivation. He describes two situations where entropy decreases and all records of it are permanently erased. In both scenarios, the entropy in the systems first increases and then decreases, but the decrease is accompanied by an erasure of any memory of its occurrence. The key to entropy decrease in the first place is a correlation between the observer and the phenomenon in question. As Maccone explains, when an interaction occurs between an observer and an observed phenomenon that decreases the entropy of the correlated observer-observed system, the interaction must also reduce their quantum mutual information. When this information is destroyed, the observer’s memory is destroyed along with it.
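For readers unfamiliar with the term, "quantum mutual information" has a standard definition; the formulas below are textbook conventions, not expressions taken from Maccone's paper:

```latex
% Von Neumann entropy of a quantum state \rho:
S(\rho) = -\operatorname{Tr}(\rho \log \rho)

% Quantum mutual information between observer A and observed system B,
% which the entropy-decreasing interaction must reduce:
I(A{:}B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB})
```

When I(A:B) is driven to zero, no correlations, and hence no records, of the phenomenon remain in the observer.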

the arrow of time


informationphilosopher |  The laws of nature, except the second law of thermodynamics, are symmetric in time. Reversing the time in the dynamical equations of motion simply describes everything going backwards. The second law is different. Entropy must never decrease in time. 

Many natural processes are apparently irreversible. Irreversibility is intimately connected to the direction of time. Identifying the physical reasons for the observed irreversibility, the origin of irreversibility, would contribute greatly to understanding the apparent asymmetry of nature in time, despite nature's perfect symmetry in space. 

In 1927, Arthur Stanley Eddington coined the term "Arrow of Time" in his book The Nature of the Physical World. He connected "Time's Arrow" to the one-way direction of increasing entropy required by the second law of thermodynamics. This is now known as the "thermodynamic arrow."
In his later work, Eddington identified a "cosmological arrow," the direction in which the universe is expanding, as shown by Edwin Hubble about the time Eddington first defined the thermodynamic arrow.
There are now at least five other proposed arrows of time (discussed below). We can ask whether one arrow is a "master arrow" that all the others are following, or perhaps time itself is just a given property of nature that is otherwise irreducible to something more basic, as is space. 

Given the four-dimensional space-time picture of special relativity, and given that the laws of nature are symmetric in space, we may expect the laws to be invariant under a change in time direction. The laws do not depend on position in space or direction, they are invariant under translations and rotations, space is assumed uniform and isotropic. But time is not just another spatial dimension. It enters into calculations of event separations as an imaginary term (multiplied by the square root of minus 1). Nevertheless, all the dynamical laws of motion are symmetric under time reversal. 

So the basic problem is - how can macroscopic irreversibility result from microscopic processes that are fundamentally reversible? 


Friday, November 21, 2014

struggley looking the wrong way for the "infinitely great and incorporeal intelligence"...,


NYTimes |  Bloggers have noticed the religious symbols in the movie. There are those 12 apostles, and there’s a Noah’s ark. There is a fallen angel named Dr. Mann who turns satanic in an inverse Garden of Eden. The space project is named Lazarus. The heroine saves the world at age 33. There’s an infinitely greater and incorporeal intelligence offering merciful salvation.

But this isn’t an explicitly religious movie. “Interstellar” is important because amid all the culture wars between science and faith and science and the humanities, the movie illustrates the real symbiosis between these realms.

More, it shows how modern science is influencing culture. People have always bent their worldviews around the latest scientific advances. After Newton, philosophers conceived a clockwork universe. Individuals were seen as cogs in a big machine and could be slotted into vast bureaucratic systems.

But in the era of quantum entanglement and relativity, everything looks emergent and interconnected. Life looks less like a machine and more like endlessly complex patterns of waves and particles. Vast social engineering projects look less promising, because of the complexity, but webs of loving and meaningful relationships can do amazing good.

As the poet Christian Wiman wrote in his masterpiece, “My Bright Abyss,” “If quantum entanglement is true, if related particles react in similar or opposite ways even when separated by tremendous distances, then it is obvious that the whole world is alive and communicating in ways we do not fully understand. And we are part of that life, part of that communication. ...”

I suspect “Interstellar” will leave many people with a radical openness to strange truth just below and above the realm of the everyday. That makes it something of a cultural event.

Thursday, November 20, 2014

is "second-order science" any kind of science at all?


constructivism |   Context: The journal Constructivist Foundations celebrates ten years of publishing articles on constructivist approaches, in particular radical constructivism. Problem: In order to preserve the sustainability of radical constructivism and regain its appeal to new generations of researchers, we set up a new course of action for and with the radical constructivist community to study its innovative potential. This new avenue is “second-order science.” Method: We specify two motivations of second-order science, i.e., the inclusion of the observer, and self-reflexivity that allows second-order science to operate on the products of normal or first-order science. Also, we present a short overview of the contributions that we have collected for this inaugural issue on second-order science. Results: These six initial contributions demonstrate the potential of the new set of approaches to second-order science across several disciplines. Implications: Second-order science is believed to be a cogent concept in the evolution of science, leading to a new wave of innovations, novel experiments and a much closer relationship with current research in the cognitive neurosciences in particular, and with evolutionary and complexity theories in general.

wikipedia |  One version of social constructivism contends that categories of knowledge and reality are actively created by social relationships and interactions. These interactions also alter the way in which scientific episteme is organized.

Social activity presupposes human beings inhabiting shared forms of life, and in the case of social construction, utilizing semiotic resources (meaning making and meaning signifying) with reference to social structures and institutions. Several traditions use the term Social Constructivism: psychology (after Lev Vygotsky), sociology (after Peter Berger and Thomas Luckmann, themselves influenced by Alfred Schütz), sociology of knowledge (David Bloor), sociology of mathematics (Sal Restivo), philosophy of mathematics (Paul Ernest). Ludwig Wittgenstein's later philosophy can be seen as a foundation for Social Constructivism, with its key theoretical concepts of language games embedded in forms of life.

Constructivism in philosophy of science
Thomas Kuhn argued that changes in scientists' views of reality not only contain subjective elements, but result from group dynamics, "revolutions" in scientific practice and changes in "paradigms".[3] As an example, Kuhn suggested that the Sun-centric Copernican "revolution" replaced the Earth-centric views of Ptolemy not because of empirical failures, but because of a new "paradigm" that exerted control over what scientists felt to be the more fruitful way to pursue their goals.
"But paradigm debates are not really about relative problem-solving ability, though for good reasons they are usually couched in those terms. Instead, the issue is which paradigm should in future guide research on problems many of which neither competitor can yet claim to resolve completely. [A decision is called for] and in the circumstances that decision must be based less on past achievement than on future promise."
—Thomas Kuhn, The Structure of Scientific Revolutions, p. 157
The view of reality as accessible only through models was called model-dependent realism by Stephen Hawking and Leonard Mlodinow.[4] While not rejecting the idea of "reality-as-it-is-in-itself", model-dependent realism suggests that we cannot know "reality-as-it-is-in-itself", but only an approximation of it provided by the intermediary of models.[5] These models evolve over time as guided by scientific inspiration and experiment.

In the field of the social sciences, constructivism as an epistemology urges that researchers reflect upon the paradigms that may be underpinning their research, and in the light of this that they become more open to consider other ways of interpreting any results of the research. Furthermore, the focus is on presenting results as negotiable constructs rather than as models that aim to "represent" social realities more or less accurately. Norma Romm in her book Accountability in Social Research (2001) argues that social researchers can earn trust from participants and wider audiences insofar as they adopt this orientation and invite inputs from others regarding their inquiry practices and the results thereof.

the latent nature of global information warfare


springer |  Let us return to the nature of information warfare. In the past, war has always and only been real, in the system + model sense, like the bed in which you sleep and the apple you eat. The hard facts of war were inevitably accompanied by their informational shadows: the human shouting, the smell of horses, the sounds of trumpets in battles, the rhythm of machine guns, the pitched whistles of bombs falling from the sky, the smell of napalm, the marks left by the tanks’ tracks. For a short time, in the eighties, passive mass media and digital consumerism made us mistakenly think that war could be experienced by the public as virtual: a televised or computerized game, involving only representations to which nothing corresponded, like shadows without objects, simulacra in Baudrillard’s terminology. Thus, in 1991, Baudrillard argued in The Gulf War Did Not Take Place that the hi-tech fighting on the American side during the first Gulf War had transformed a conflict into propaganda and mass-mediated experience. The analysis was correct both in perceiving a difference and in identifying that difference in the decoupling between the system and the model. But it was wrong in selecting models as the new battlefields. Global information warfare is not virtual. It is mostly latent, that is, it is in the world but not experienced as part of the world. It is a war without shadows. You cannot see it, you cannot hear it, it happens silently every day, it can hit anyone anywhere, and we can all be its unaware victims. Take for instance distributed denial-of-service attacks. According to Arbor Networks, more than 2,000 DDoS attacks occur worldwide every day. Their number is increasing, and more and more countries are involved that are not officially at war with each other. Such attacks are very cheap. According to TrendMicro Research, a week-long DDoS attack, capable of taking a small organization offline, can cost as little as $150 in the underground market.
This is just an example. Conflicts in the infosphere—not just DDoS attacks, but also trade wars, currency wars, patent wars, marketing wars, and other silent forms of informational battles to win hearts, minds, and wallets—are increasingly neither real nor virtual, but latent to most of their victims. They are nonetheless dangerous and wasteful. They require special interfaces to be perceived. They will require a special sensitivity to be eradicated.

big data and their epistemological challenge


springer |  It is estimated that humanity accumulated 180 EB of data between the invention of writing and 2006. Between 2006 and 2011, the total grew ten times and reached 1,600 EB. This figure is now expected to grow fourfold approximately every 3 years. Every day, enough new data are being generated to fill all US libraries eight times over. As a result, there is much talk about “big data”. This special issue on “Evolution, Genetic Engineering and Human Enhancement”, for example, would have been inconceivable in an age of “small data”, simply because genetics is one of the data-greediest sciences around. This is why, in the USA, the National Institutes of Health (NIH) and the National Science Foundation (NSF) have identified big data as a programme focus. One of the main NSF–NIH interagency initiatives addresses the need for core techniques and technologies for advancing big data science and engineering (see NSF-12-499).
Despite the importance of the phenomenon, it is unclear what exactly the term “big data” means and hence refers to. The aforementioned document specifies that: “The phrase ‘big data’ in this solicitation refers to large, diverse, complex, longitudinal, and/or distributed data sets generated from instruments, sensors, Internet transactions, email, video, click streams, and/or all other digital sources available today and in the future.” You do not need to be an analytic philosopher to find this both obscure and vague. Wikipedia, for once, is also unhelpful. Not because the relevant entry is unreliable, but because it reports the common definition, which is unsatisfactory: “data sets so large and complex that they become awkward to work with using on-hand database management tools”. Apart from the circular problem of defining “big” with “large”, the definition suggests that data are too big or large only in relation to our current computational power. This is misleading. Of course, “big”, as many other terms, is a relational predicate: a pair of shoes is too big for you, but fine for me. It is also trivial to acknowledge that we tend to evaluate things non-relationally, in this case as absolutely big, whenever the frame of reference is obvious enough to be left implicit. A horse is a big animal, no matter what whales may think. Yet, these two simple points may give the impression that there is no real trouble with “big data” being a loosely defined term referring to the fact that our current computers cannot handle so many gazillions of data efficiently. And this is where two confusions seem to creep in. First, that the epistemological problem with big data is that there is too much of them (the ethical problem concerns how we use them; see below). And second, that the technological solution to the epistemological problem is more and better techniques and technologies, which will “shrink” big data back to a manageable size. 
The epistemological problem is different, and it requires an equally epistemological solution.

it knows if you've been bad or good, so be good for goodness sake!


WaPo |  According to Google, I am a woman between the ages of 25 and 34 who speaks English as her primary language and has accumulated an unwieldy 74,486 e-mails in her life. I like cooking, dictionaries and Washington, D.C. I own a Mac computer that I last accessed at 10:04 p.m. last night, at which time I had 46 open Chrome tabs. And of the thousands and thousands of YouTube videos I have watched in my lifetime, a truly embarrassing number of them concern (a) funny pets or (b) Taylor Swift.

I didn’t tell Google any of these things intentionally, of course — I didn’t fill out a profile or enter a form. But even as you search Google, it turns out, Google is also searching you.

This isn’t exactly new news. Google has, since 2009, published a transparency tool called Dashboard, which lets users see exactly what kind of data the Internet giant has on them and from which services. But the issue of data collection has provoked renewed anxiety of late, perhaps spurred by recent investigations into personal data and search engines in Europe and Asia — as well as the high-profile hacking of celebrities’ personal data and the shadow of last year’s National Security Agency revelations.

According to a recent survey by the consumer research firm Survata, people care more about Google accessing their personal electronic data than they do about the NSA, their boss, their parents, or their spouse. Which is unfortunate, given that your parents and boss will probably never see everything you search, e-mail and click — while Google logs that material more or less all the time.

“Google knows quite a lot,” said Ondrej Prostrednik, the author of a recent Medium post about Google data collection that has begun making the Reddit rounds. “People outside of Google can only guess. But it is important to realize that we are the ones giving it all the data they know.”

Wednesday, November 19, 2014

ethical heroism in the face of institutional religion and self-righteous clericalism...,


WaPo |  As a Protestant, I have no particular insight into the internal theological debates of Catholicism. But the participants seem to inhabit different universes. One side (understandably) wants to shore up the certainties of an institution under siege. Francis begins from a different point: a pastoral passion to meet people where they are — to recognize some good, even in their brokenness, and to call them to something better. That something better is not membership in a stable institution, or even the comforts of ethical religion; it is a relationship with Jesus, from which all else follows.

Instead of being a participant in a cultural battle, Francis says, “I see the church as a field hospital after battle.” First you sew up the suffering (which, incidentally, includes all of us). “Then we can talk about everything else. Heal the wounds.” The temptation, in his view, is to turn faith into ideology. “The faith passes,” he recently said, “through a distiller and becomes ideology. And ideology does not beckon [people]. In ideologies there is not Jesus; in his tenderness, his love, his meekness. And ideologies are rigid, always. . . . The knowledge of Jesus is transformed into an ideological and also moralistic knowledge, because these close the door with many requirements.” 

The message seems simple. It actually highlights a complexity at the heart of Christianity: Its founder coupled a call for ethical heroism (don’t even lust in your heart) with a disdain for institutional religion and self-righteous clericalism. And this has been disorienting to institutionalists from the start. 

Francis has devoted serious attention to reforming the institutional expression — particularly the finances — of the Catholic Church. But he has chosen to emphasize the most subversive and challenging aspects of Christian faith. He really does view rigidity, clericalism and hypocrisy as just as damaging as (or more damaging than) sexual matters. Liberals want to incorporate this into their agenda. But the pope has his own, quite different agenda — which has nothing to do with our forgettable ideological debates. It is always revolutionary, and confusing to the faithful, when a religious leader believes that the Sabbath (including all the rules and institutions of religion) was made for man, and not the other way around.

is pope francis smart, conservative, or something less familiar?


NYTimes |  In Pope Francis’ most significant move yet to reshape the leadership of the Roman Catholic Church in the United States, Blase J. Cupich took his seat in Chicago on Tuesday as archbishop of the nation’s third-largest Catholic archdiocese and called on the church not to be afraid of change.

In a multilingual installation Mass at Holy Name Cathedral, with American bishops, his large extended family and Mayor Rahm Emanuel looking on, Archbishop Cupich was handed the golden crosier, a shepherd’s staff, that belonged to a powerful liberal predecessor, Cardinal George Mundelein, who became archbishop of Chicago 99 years ago and served for 24 years.

“We as a church should not fear leaving the security of familiar shores, the peacefulness of the mountaintop of our self-assuredness, but rather walk into the mess,” Archbishop Cupich said in an upbeat and plain-spoken homily.

With Archbishop Cupich now seated, Pope Francis gets a media-savvy American communicator in tune with his message of reinvigorating the church by stressing mercy over judgmentalism, change over stasis, and the imperative for all Catholics to go to the margins of society to serve the poor, migrants and those without hope. It is a message that not every bishop has enthusiastically embraced.

contempt for conservatism, religion, and history among social scientists...,


cs.nyu.edu |  The analysis in (Duarte et al. 2014) of the bias in the social sciences against conservatives and conservatism is important and timely. (I am not at all competent to evaluate either the effectiveness or the reasonableness of their proposed remedies, and therefore will not discuss them.)  

If anything, the article understates the blatant, explicit contempt shown to conservative views in the published scientific literature. For instance, Wilson, Ausman, and Mathews (1973) wrote: The “ideal” conservative is characterized as conventional, conforming, antihedonistic, authoritarian, punitive, ethnocentric, militaristic, dogmatic, superstitious, and antiscientific.

After that long stream of insults, the reader is quite prepared to be told that conservatives also smell funny. Obviously, most or all of those adjectives could have been replaced with equally accurate adjectives of a neutral or positive valence, e.g., respectful of tradition, uneccentric, abstemious, forceful, stern, patriotic, and so on. They proceed to characterize conservatism as a “syndrome” that has to be “explained,” whereas, by implication, being liberal is just the way normal people are, and as such demands no particular explanation. (Stankov (2008) also refers to the “Conservative syndrome.”) This kind of language would be appropriate for an article in Mother Jones, but seems entirely out of place in a scientific paper in the Journal of Personality and Social Psychology.

Duarte et al. are primarily concerned with the damage that this kind of bias does to social science research. Equally or more important is the poisonous impact of these kinds of publications on the state of political discourse, particularly when echoed gleefully in articles in liberal publications such as (Mooney 2014). First, such claims obviously increase the dislike and distrust of scientists and science among conservatives, and the sense that the pronouncements of science are merely a liberal conspiracy. Second, the last thing that liberals in this country need at this time — and I write as a dyed-in-the-wool liberal — is more reasons to feel smugly superior. Third and most importantly, democracy is based on political discourse, and meaningful political discourse depends, to some extent, on taking what your opponent says seriously and engaging with it on that basis. If liberals believe that conservative opinions are atavistic remnants of attitudes that were adaptive when we were all living in caves or on the savannah, and can therefore be dismissed out of hand, then no serious discourse is possible.

If these negative views of conservatism were in fact entirely valid, then the scientific community would be in the difficult position of balancing the scientists’ commitment to truth against the good of society. However, since, as Duarte et al. demonstrate at length and in detail, these views are certainly one-sided, often exaggerated, and sometimes false, there is no such justification for them.

In this note, I want to add to the argument in Duarte et al. by making two further points. First, parallel to the contempt for conservatism, and related to it, is a pervasive contempt for religion in psychological studies of religious belief. This connection is explicit in works such as Kanazawa’s (2010) paper, “Why Liberals and Atheists Are More Intelligent.” Second, the contemptuous views of both conservatism and religion are exacerbated by a lack of historical perspective and an indifference to historical accuracy. I conclude with some general comments about the risks attendant on this kind of research and the caution that needs to be exercised.

you know who you are...,


WaPo |  Some people just can't seem to keep a beat.

You know the ones: They seem to be swaying to their own music or clapping along to a beat only they can hear. You may even think that describes you.

The majority of humans, however, do this very well. We clap, dance, march in unison with few problems; that ability is part of what sets us apart from other animals.

But it is true that rhythm — specifically, coordinating your movement with something you hear — doesn't come naturally to some people. Those people represent a very small sliver of the population and have a real disorder called "beat deafness."

Unfortunately, your difficulty dancing or keeping time in band class probably doesn't quite qualify.

A new study by McGill University researchers looked more closely at what might be going on with "beat deaf" individuals, and the findings may shed light on why some people seem to be rhythm masters while others struggle.

processing structure in language and music


springer |  The relationship between structural processing in music and language has received increasing interest in the past several years, spurred by the influential Shared Syntactic Integration Resource Hypothesis (SSIRH; Patel, Nature Neuroscience, 6, 674–681, 2003). According to this resource-sharing framework, music and language rely on separable syntactic representations but recruit shared cognitive resources to integrate these representations into evolving structures. The SSIRH is supported by findings of interactions between structural manipulations in music and language. However, other recent evidence suggests that such interactions also can arise with nonstructural manipulations, and some recent neuroimaging studies report largely nonoverlapping neural regions involved in processing musical and linguistic structure. These conflicting results raise the question of exactly what shared (and distinct) resources underlie musical and linguistic structural processing. This paper suggests that one shared resource is prefrontal cortical mechanisms of cognitive control, which are recruited to detect and resolve conflict that occurs when expectations are violated and interpretations must be revised. By this account, musical processing involves not just the incremental processing and integration of musical elements as they occur, but also the incremental generation of musical predictions and expectations, which must sometimes be overridden and revised in light of evolving musical input.