Thursday, April 10, 2014

what was the legal issue again? dickishness takes on a life of its own...,


theblaze |  During a contentious congressional hearing on Tuesday, Attorney General Eric Holder disdainfully told Rep. Louie Gohmert (R-Texas) “good luck with your asparagus.”

Many, including TheBlaze, assumed Holder was mocking Gohmert for seemingly fumbling his words back in 2013 when he said, “The attorney general will not cast aspersions on my asparagus!”

Gohmert was ridiculed at the time by Comedy Central’s “The Colbert Report,” the Washington Post, the U.K. Guardian and more for the “famously embarrassing” moment.

But Gohmert told Glenn Beck on Wednesday that he did not fumble his words back in 2013, and was in fact using a quote that goes back decades.

“Percy Foreman was a very, very liberal criminal defense attorney, but he was incredible in the courtroom,” Gohmert said on Beck’s radio show. “When somebody started attacking his integrity, he stood up and said, ‘I object, he’s casting aspersions on my asparagus!’ And people would scratch their heads, but it brought down the level of the rancor. I was using a Percy Foreman line from criminal trials back probably 50 years ago.”

Other research confirms that the line was used in decades past. A 1973 book by John Dos Passos includes a letter where an individual says, “don’t think that I’m ‘casting asparagus’…”

parliamentary theatrical dickishness signifying nothing...,


theblaze |  Last year, Holder and Gohmert went head-to-head following the Boston Marathon bombings. That exchange ended with the Texas representative tripping over his words and declaring: “The attorney general will not cast aspersions on my asparagus!”

Gohmert’s critics quickly seized on his bungled statement and it became the butt of jokes in certain corners.

Now, fast-forward to April 2014, and Holder and Gohmert are at it again. This time, however, instead of arguing about the Boston terror attacks, the two argued over Gohmert’s claim that the DOJ has ignored requests to turn over certain documents to congressional investigators.

“I think what we promised to do is to provide you and your staff with,” Holder began to say in the hearing.

“Sir, I’ve read you what your department promised and it is inadequate, and I realize that contempt is not a big deal to our attorney general, but it is important that we have proper oversight,” Gohmert interjected.

“You don’t want to go there, okay?” an angry Holder said.

“I don’t want to go there? About the contempt?” Gohmert asked.

“You should not assume that is not a big deal to me. I think that it was inappropriate, I think it was unjust. But never think that was not a big deal to me. Don’t ever think that,” Holder said.


prologue...,


Our clip begins after Attorney General Eric Holder addresses Rep. Louie Gohmert in a House Judiciary hearing. Holder says something like 'You don't know what you're talking about.' Quote:

"You don't know what the FBI did. You don't know what the FBI's interaction was with the Russians. You don't know what questions were put to the Russians, whether those questions were responded to. You simply do not know that. And you have characterized the FBI as being not thorough, or taken exception to my characterization of them as being thorough. I know what the FBI did. You cannot know what I know. That is all."

With that, Louie grows furious. This is where the clip starts, as Louie has literally turned red. He wants the AG's head but -- NO! -- Louie's time has expired. So he attempts to counter-attack Holder using a parliamentary procedure, the "Point of Personal Privilege." Unfortunately the "Privilege" can't be used to act a Texas derpass. But Louie doesn't know it, or doesn't care, so on he blindly rants while the committee members ask him repeatedly to shut it. Finally, after growing fully crimson and unhinged, he blurts out the most magnificent defense of a dumb-dog congressman ever:

"HE TURN AND CAST ASPERSIONS ON MY ASPARAGUS."

Wednesday, April 09, 2014

without femtoaggressions to give it value, purpose, and meaning, the cathedral will implode


theatlantic | The study that might have put to rest much of the recent agitation about microaggressions has unfortunately never been published. Microaggressions, for those who are not up on the recent twists and turns of American public discourse, are the subtle prejudices found even in the most liberal parts of our polity. They are revealed when a lecturer cites mainly male sources and no gay ones, when we use terms such as “mankind,” or when we discuss what Michelle Obama wore when she visited pandas in China, something we would not note about a man. In short, they are the current obsession of political correctness squads.

The study that I believe could have helped a great deal was conducted by a research assistant of mine at Columbia University who disappeared before she completed her Ph.D. Carolyn (I am withholding her last name in order to acknowledge her without embarrassing her) asked members of 80 groups in New York City what they felt about other such groups. She avoided broad strokes and asked not about divisions between black and white, but what African Americans felt about Africans from Nigeria and blacks from the West Indies. She asked Hispanics about Dominicans, Haitians, Mexicans, and Cubans, and so on.

What Carolyn found was that there was little love lost between any two groups. Members of all the 80 groups she studied attached all kinds of unflattering labels to members of other groups, even if they were of the same race or ethnic group. When she interviewed members of subgroups, they were unsparing about each other. German Jews felt that Jews of Polish origin were very uncouth (and surely would not want their daughter to marry one or to share a synagogue with them). The Polish Jews, in turn, felt that those of German background were stuck up and “assimilated,” and hence one was best off crossing to the other side of the street if they neared. Iraqis from Basra considered those from Baghdad to be too modern, and those from Baghdad considered their brothers and sisters from Basra as provincial—and so on and so on. Today they would all be called at least microaggressive.

None of this is surprising to sociologists, who have long held that one major way community cohesion is promoted is by defining it against out-groups—and that there is a strong psychological tendency to attribute positive adjectives to an in-group and negative ones to outsiders. In short, it’s part—not a pretty part—of human nature, or at least social nature. Choose any group and you will find its members griping about all the others.

I hence urge those who are troubled by the ways others talk about them to use Carolyn’s findings as a baseline. That is, not to ignore slurs and insults, and most certainly not racial, ethnic, or any other kind of prejudices, but merely to “deduct” from them what seems to be standard noise, the normal sounds of human rambling. We may wish for a world in which people say only kind things about each other, but until we get there, we should not take umbrage at every negative note or adjective that is employed. For now, that is something most of us do—yes, I suspect even those who rail against microaggression.

der Narzißmus der kleinen Differenzen


wikipedia | The narcissism of small differences is a term that describes 'the phenomenon that it is precisely communities with adjoining territories, and related to each other in other ways as well, who are engaged in constant feuds and ridiculing each other' - 'such sensitiveness [...] to just these details of differentiation'.[1]
 
The term was coined by Sigmund Freud in 1917, based on the earlier work of British anthropologist Ernest Crawley: 'Crawley, in language which differs only slightly from the current terminology of psychoanalysis, declares that each individual is separated from others by a "taboo of personal isolation"...this "narcissism of minor differences"'.[2]

The term appeared in Civilization and Its Discontents (1929–1930) in relation to the application of the inborn aggression in man to ethnic (and other) conflicts - a process still considered by Freud, at that point, as 'a convenient and relatively harmless satisfaction of the inclination to aggression'.[3]

For Lacanians, the concept clearly related to the sphere of the Imaginary: 'the narcissism of small differences, which situates envy as the decisive element...in issues that involve narcissistic image'.[4]

Glen O. Gabbard, M.D. suggested Freud's "Narcissism of Small Difference" provides a framework within which to understand that, in a love relationship, there can be a need to find, and even exaggerate, differences in order to preserve a feeling of separateness and self.[5]

In terms of postmodernity, consumer culture has been seen as predicated on 'the "narcissism of small differences"...to achieve a superficial sense of one's own uniqueness, an ersatz sense of otherness which is only a mask for an underlying uniformity and sameness'.[6]

Tuesday, April 08, 2014

college coach pay...,


slate | Perhaps nobody has profited quite so handsomely from the college sports arms race as the top coaches in NCAA football and basketball, who routinely pull down seven-figure paydays.

But who knew that these were boom times for golf and tennis coaches too?

The American Association of University Professors is out with its latest annual report on the economic health of its members' profession. Executive summary: It’s pretty weak. But this year, the AAUP has added a fun little wrinkle by comparing the growth of academic and sports spending. Particularly intriguing is this chart contrasting pay growth for faculty and head coaches. In the top football and basketball divisions, the median inflation-adjusted pay for head coaches roughly doubled between 2005-06 and 2011-12. But coaches in minor sports didn’t do so badly either: In D1-AA golf and tennis, pay packages grew by 79 and 53 percent, respectively. Meanwhile, even at doctoral institutions (shown here as category I), professors only managed a 4 percent real raise over that time.
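
To make the "inflation-adjusted" comparison concrete, here is a minimal sketch in Python. The CPI-U deflators are approximate BLS annual averages for 2005 and 2011; the salary inputs are invented for illustration and are not taken from the AAUP report.

CPI_2005 = 195.3  # approximate annual average CPI-U, 2005 (BLS)
CPI_2011 = 224.9  # approximate annual average CPI-U, 2011 (BLS)

def real_growth(nominal_start, nominal_end, cpi_start, cpi_end):
    """Percent growth after deflating the end value into start-year dollars."""
    deflated_end = nominal_end * (cpi_start / cpi_end)
    return (deflated_end / nominal_start - 1) * 100

# Hypothetical coach: nominal pay rises from $1.0M to $2.3M -> roughly +100% real
print(round(real_growth(1_000_000, 2_300_000, CPI_2005, CPI_2011)))
# Hypothetical professor: nominal pay rises from $100k to $120k -> roughly +4% real
print(round(real_growth(100_000, 120_000, CPI_2005, CPI_2011)))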

that's why you steal nutella out the cafeteria and keep you some hot pockets...,


HuffPo |  His on-court skills have generated a lot of money, but University of Connecticut basketball player Shabazz Napier says there are nights he can't afford food.

Speaking to reporters late last month, the point guard reflected on the relationship between student athletes and the schools they attend. He then offered up a startling example of the disconnect between the two.

"We're definitely blessed to get scholarships to our universities, but at the end of the day, that doesn't cover everything," Napier told a group of reporters, adding later in the conversation, "I don't think student athletes should get hundreds of thousands of dollars, but ... there are hungry nights that I go to bed and I'm starving."

In an email to The Huffington Post, a University of Connecticut spokesman said all of the school's scholarship athletes, including Napier, receive the maximum meal plan allowed under NCAA rules.
"UConn does not have a cafeteria devoted specifically to student-athletes," he added, "but they have access to the same cafeterias which are available to all our students."

Napier's comments come shortly after the National Labor Relations Board endorsed Northwestern University football players looking to unionize.

“The players spend 50 to 60 hours per week on their football duties during a one-month training camp prior to the start of the academic year and an additional 40 to 50 hours per week on those duties during the three or four month football season,” a copy of the NLRB ruling explains. “Not only is this more hours than many undisputed full-time employees work at their jobs, it is also many more hours than the players spend on their studies.”

On Sunday, NCAA president Mark Emmert called the effort "a grossly inappropriate solution to the problems" that exist in intercollegiate athletics. He said a "union-employee model" would "throw away the entire collegiate model for athletics."

the myth of working your way through college...,


theatlantic |  A lot of Internet ink has been spilled over how lazy and entitled Millennials are, but when it comes to paying for a college education, work ethic isn't the limiting factor. The economic cards are stacked such that today’s average college student, without support from financial aid and family resources, would need to complete 48 hours of minimum-wage work a week to pay for his courses—a feat that would require superhuman endurance, or maybe a time machine.

To take a close look at the tuition history of almost any institution of higher education in America is to confront an unfair reality: Each year’s crop of college seniors paid a little bit more than the class that graduated before. The tuition crunch never fails to provide new fodder for ongoing analysis of the myths and realities of The American Dream. Last week, a graduate student named Randy Olson listened to his grandfather extol the virtues of putting oneself through college without family support. But paying for college without family support is a totally different proposition these days, Olson thought. It may have been feasible 30 years ago, or even 15 years ago, but it's much harder now. 

He later found some validation for these sentiments on Reddit, where one user had started a thread about the increasing cost per course at Michigan State University. MSU calculates tuition by the "credit hour," the term for the number of hours spent in a classroom per week. By this metric, which is used at many U.S. colleges and universities, a course that's worth three credit hours is a course that meets for three hours each week during the semester. If the semester is 15 weeks long, that adds up to 45 total hours of a student's time. The Reddit user quantified the rising cost of tuition by cost per credit hour:
This is interesting. A credit hour in 1979 at MSU was 24.50, adjusted for inflation that is 79.23 in today dollars. One credit hour today costs 428.75.
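
The quote above makes the point in dollars; converting it into hours of minimum-wage work shows why "working your way through" no longer adds up. A minimal sketch, using the MSU per-credit figures from the quote and the federal minimum wage rates of $2.90 (1979) and $7.25 (2014):

MIN_WAGE_1979 = 2.90       # federal minimum wage, 1979 ($/hr)
MIN_WAGE_2014 = 7.25       # federal minimum wage, 2014 ($/hr)
COST_PER_CREDIT_1979 = 24.50    # MSU cost per credit hour, from the quote above
COST_PER_CREDIT_2014 = 428.75   # MSU cost per credit hour, from the quote above

def work_hours_per_credit(cost, wage):
    """Hours of minimum-wage work needed to pay for one credit hour."""
    return cost / wage

print(f"1979: {work_hours_per_credit(COST_PER_CREDIT_1979, MIN_WAGE_1979):.1f} hours/credit")  # ~8.4
print(f"2014: {work_hours_per_credit(COST_PER_CREDIT_2014, MIN_WAGE_2014):.1f} hours/credit")  # ~59.1

# A 15-credit semester in 2014 works out to roughly 887 hours of minimum-wage
# labor; spread over a 15-week term, that is about 59 hours of work per week,
# before a single hour of class or study -- the article's central point.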

if college is only exploratory and family connections ensure future job prospects...,


theatlantic | There is a widespread belief that humanities Ph.D.s have limited job prospects. The story goes that since tenure-track professorships are increasingly being replaced by contingent faculty, the vast majority of English and history Ph.D.s now roam the earth as poorly-paid adjuncts or, if they leave academia, as baristas and bookstore cashiers. As English professor William Pannapacker put it in Slate a few years back, “a humanities Ph.D. will place you at a disadvantage competing against 22-year-olds for entry-level jobs that barely require a high-school diploma.” His advice to would-be graduate students was simple: Recognize that a humanities Ph.D is now a worthless degree and avoid getting one at all cost.

Since most doctoral programs have never systematically tracked the employment outcomes of their Ph.D.s, it was hard to argue with Pannapacker when his article came out. Indeed, all anecdotal evidence bade ill for humanities doctorates. In 2012, the Chronicle of Higher Education profiled several humanities Ph.D.s who were subsisting on food stamps. Last year, the Pittsburgh Post-Gazette eulogized Margaret Mary Vojtko, an 83-year-old French adjunct who died in abject poverty after teaching for more than two decades at Duquesne University, scraping by on $25,000 a year before being unceremoniously fired without severance or retirement pay.

Recent studies suggest that these tragedies do not tell the whole story about humanities Ph.D.s. It is true that the plate tectonics of academia have been shifting since the 1970s, reducing the number of good jobs available in the field: “The profession has been significantly hollowed out by the twin phenomena of delayed retirements of tenure-track faculty and the continued ‘adjunctification’ of the academy,” Andrew Green, associate director at the Career Center at the University of California, Berkeley, told me. In the wake of these changes, there is no question that humanities doctorates have struggled with their employment prospects, but what is less widely known is that between a fifth and a quarter of them go on to work in well-paying jobs in media, corporate America, non-profits, and government. Humanities Ph.D.s are all around us—and they are not serving coffee.

Monday, April 07, 2014

Bro.Feed Not Scurred, Politically Correct, or Intellectually Consti(err...., nevermind.)

whatever you think of cultural non-judgementalism in the abstract, what it has meant in reality is that the pathologies of lower-class black culture have been played out on other black folks and on non-black neighbors.
THROW OUT DAMNED "NON-JUDGMENTALISM"!!!!!! (Subjective Opinions)

REPLACE IT WITH MEASUREMENT!!!!!

GIVE The "Black Racial Services Machine" and the "Post-Racial Progressive Fundamentalist Alliance" - both of whom have FUSED the "Americanized Negro's" "Black Agenda" with PROGRESSIVE POLITICAL (Only) Expression - THESE MANDATES!!!!!!

1) INCREASE the Black graduation rate to the point at which the new market of TECHNICALLY EDUCATED BLACKS can be a DISRUPTIVE MARKET FORCE in the key areas within the Black Community Marketplace (Health Care, Business Management, etc)

2) LOWER THE BLACK HOMICIDE VICTIMIZATION RATE from 47% down to 25% IN 5 YEARS, moving it closer to our 14% POPULATION PROPORTION

3) GROW THE LOCAL BLACK ECONOMY BY 50% IN 5 YEARS, expanding its ability to CREATE JOBS, maintain the LOCAL TAX BASE, and service the retail and service needs of the people in a 4-mile radius

4) REDUCE THE NUMBER OF 'OUT OF RESIDENCE BLACK FATHERS' DOWN TO 20% - as the Black Community acknowledges the critical importance of having a BLACK MAN AND WOMAN working collaboratively to care for their children.   RECHRISTEN "The Black Church" from its claim to having been created in response to WHITE SUPREMACY (what Rev Raphael Warnock of Ebenezer Church in Atlanta said) OVER TO the CENTERPIECE OF THE BLACK COMMUNITY SOCIAL SERVICES STRUCTURE

CNu - IF the AMERICANIZED NEGRO'S "Struggle Motion" is not REMOVED FROM STRAIGHT UP POLITICAL OPPORTUNISM these CORRUPT NEGROES on MSNBC will be allowed to retain their platform and MOLEST THE AMERICANIZED NEGRO'S AGENDA - for ONLY the benefit of PROGRESSIVISM - the condition of the Negro BE DAMNED. 
In the absence of Jim Crow, America has organically resegregated itself with a vengeance - with managerial and professional class blacks spatially and socially distancing themselves from lower-class black cultural pathology.
WHO SAYS that the MOST ACTIVE VOICES representing the Americanized Negro's interests BELIEVE THAT "JIM CROW IS DEAD TODAY"?

Do YOU believe that Michelle Alexander believes that the presence of AG Eric Holder, his boss Barack Obama and a DISTRICT ATTORNEY and SHERIFF (jailer) and POLICE CHIEF who was appointed by a MAYOR - ALL OF WHOM BLACKS VOTED FOR - means that THE NEW JIM CROW IS DEAD?

The one thing that I learn from:
* Reading "The Final Call"
* Listening to "Black Progressive Radical Radio"
* Watching Videos ...................IS..............

THERE IS NO CORRELATION between the PRESENCE OF ESTABLISHMENT FORCES THAT THE AMERICANIZED NEGRO VOTED FOR BEING IN POWER and the RHETORIC that the RADICAL REVOLUTIONARY NEGROES will tell their CONGREGATION is still out there LURKING, hoping to cut every Black person's throats IF we don't remain UNIFIED.

The final irony is that Ras Baraka - son of Amiri Baraka - is poised to become Mayor of Newark.
Amiri Baraka told of the day that the last White mayor of Newark pulled him out of jail and brought him to the mayor's OFFICE where he was BEATEN UP ON THE FLOOR.

The REVOLUTIONARY/RADICALS have ultimately become POLITICAL CANDIDATES (see Chokwe Lumumba).

And now the biggest threat facing Ras Baraka is that HIS CAMPAIGN BUS WAS BURNED by an operative from his BLACK DEMOCRATIC opponent. 

UNTIL BLACK PEOPLE REGULATE THE ACCESS THAT POLITICS HAVE TO THEIR "CONSCIOUSNESS ABOUT THEMSELVES" - THE TRUTH THAT "YOU DON'T HAVE TO DEVELOP THE NEGRO THROUGH POLITICS" - YOU ONLY NEED TO KEEP HIS ATTENTION ALIGNED WITH 'YOUR TEAM' IS ALL THAT MATTERS FOR YOU TO HAVE SUCCESS AT HIS EXPENSE.

the answers ARE knowable, but not if you consult the politically correct and intellectually constipated academy


Coates’ primary purpose is to defend lower-class black culture as one norm among others; Chait’s purpose originally was to make a pro-Obama point, but then became more ‘yes things have changed since slavery actually’ (a practical political position). Other commentators, taking their cue from Charles Murray, look to de-racialize the conversation so as to make it about the relationship between poverty and dysfunction, yet another practical political position.

What I am getting from Coates’ argument (and what I don’t think Chait is picking up on) is that he believes that white middle-class norms and poor black norms are different but equal cultural norms. Assuming that middle-class norms are superior to black lower-class norms is therefore a value judgement. Suggesting that lower-class black people throw aside the norms that are ‘theirs’ to take up white middle-class norms is therefore an act of white supremacy, an imposition of ‘foreign’ norms on the weaker party.

This is a novel argument but not really a politically practical one. Come to think of it, it’s not only impractical, I’d say it’s politically self-destructive: whatever you think of cultural non-judgementalism in the abstract, what it has meant in reality is that the pathologies of lower-class black culture have been played out on other black folks and on non-black neighbors. In the absence of Jim Crow, America has organically resegregated itself with a vengeance - with managerial and professional class blacks spatially and socially distancing themselves from lower-class black cultural pathology.

The turd in the “can’t we all just get along” punch bowl is that no one with common sense can ignore the very high rate at which lower-class blacks victimize themselves and others. It’s not simply a matter of rudeness, incivility, and lack of regard for the commons. If one wanted to be generous about pathological lower-class black culture, one could say that the “warrior spirit” has the ratchets “going for theirs.” One might even romanticize the ratchets as 21st-century pirates. But with regard to pirates and pirate culture, the answers are historically knowable. As we now know, pirates had far more common sense and practical good judgement about restraining irrational violence and incivility within their own ranks...,

irrational violence of individuals against each other is detrimental to the profitable enterprise

scientificamerican | From countless films and books we all know that, historically, pirates  were criminally insane, traitorous thieves, torturers and terrorists. Anarchy was the rule, and the rule of law was nonexistent.

Not so, dissents George Mason University economist Peter T. Leeson in his myth-busting book, The Invisible Hook (Princeton University Press, 2009), which shows how the unseen hand of economic exchange produces social cohesion even among pirates. Piratical mythology can’t be true, in fact, because no community of people could possibly be successful at anything for any length of time if their society were utterly anarchistic. Thus, Leeson says, pirate life was “orderly and honest” and had to be to meet buccaneers’ economic goal of turning a profit. “To cooperate for mutual gain—indeed, to advance their criminal organization at all—pirates needed to prevent their outlaw society from degenerating into bedlam.” There is honor among thieves, as Adam Smith noted in The Theory of Moral Sentiments: “Society cannot subsist among those who are at all times ready to hurt and injure one another.... If there is any society among robbers and murderers, they must at least ... abstain from robbing and murdering one another.”

Pirate societies, in fact, provide evidence for Smith’s theory that economies are the result of bottom-up spontaneous self-organized order that naturally arises from social interactions, as opposed to top-down bureaucratic design. Just as historians have demonstrated that the “Wild West” of 19th-century America was a relatively ordered society in which ranchers, farmers and miners concocted their own rules and institutions for conflict resolution way before the long arm of federal law reached them, Leeson shows how pirate communities democratically elected their captains and constructed constitutions. Those documents commonly outlined rules about drinking, smoking, gambling, sex (no boys or women allowed onboard), use of fire and candles, fighting and disorderly conduct, desertion and shirking one’s duties during battle. (The last could lead to the “free rider” problem in which the even division of loot among uneven efforts leads to resentment, retaliation and economic chaos.) Enforcement was key. Just as civil courts required witnesses to swear on the Bible, pirate crews had to consent to the captain’s codes before sailing. In the words of one observer: “All swore to ’em, upon a Hatchet for want of a Bible. When ever any enter on board of these Ships voluntarily, they are obliged to sign all their Articles of Agreement ... to prevent Disputes and Ranglings afterwards.” Thus, the pirate code “emerged from piratical interactions and information sharing, not from a pirate king who centrally designed and imposed a common code on all current and future sea bandits.”

the steadily increasing probability of death camps...,

antipope |  the success of a social system can be measured by how well it supports those at the bottom of the pile—the poor, the unlucky, the non-neurotypical—rather than by how it pampers its billionaires and aristocrats. By that rule of thumb, western capitalism did really well throughout the middle of the 20th century, especially in the hybrid social democratic form: but it's now failing, increasingly clearly, as the focus of the large capital aggregates at the top (mostly corporate hive entities rather than individuals) becomes wealth concentration rather than wealth production. And a huge part of the reason it's failing is because our social system is set up to provide validation and rewards on the basis of an extrinsic attribute (what people do) which is subject to external pressures and manipulation: and for the winners it creates incentives to perpetuate and extend this system rather than to dismantle it and replace it with something more humane.

Meanwhile, jobs: the likes of George Osborne (mentioned above), the UK's Chancellor of the Exchequer, don't have "jobs". Osborne is a multi-millionaire trust-fund kid, a graduate of Eton College and Oxford, heir to a Baronetcy, and in his entire career spent a few working weeks in McJobs between university and full-time employment in politics. I'm fairly sure that George Osborne has no fucking idea what "work" means to most people, because it's glaringly obvious that he's got exactly where he wanted to be: right to the top of his nation's political culture, at an early enough age to make the most of it. Like me, he has the privilege of a job that passes test (a): it's good for him. Unlike me ... well, when SF writers get it wrong, they don't cause human misery and suffering on an epic scale; people don't starve to death or kill themselves if I emit a novel that isn't very good. 

When he prescribes full employment for the population, what he's actually asking for is that the proles get out of his hair; that one of his peers' corporations finds a use for idle hands that would otherwise be subsisting on Jobseekers Allowance but which can now be coopted, via the miracle of workfare, into producing something for very little at all. And by using the threat of workfare, real world wages can be negotiated down and down and down, until labour is cheap enough that any taskmaster who cares to crack the whip can afford as much as they need. These aren't jobs that pass test (a); for the most part they don't pass test (b) either. But until we come up with a better way of allocating resources so that all may eat, or until we throw off the shackles of Orwellian Crimestop and teach ourselves to think directly about the implications of wasting a third of our waking lives on occupations that harm ourselves and others, this is what we're stuck with ...

Sunday, April 06, 2014

the cultural cognition project


culturalcognition |  “Cultural cognition” refers to the tendency of individuals to form beliefs about societal dangers that reflect and reinforce their commitments to particular visions of the ideal society. Cultural cognition is one of a variety of approaches developed for empirical testing of the "cultural theory of risk" associated with Mary Douglas and Aaron Wildavsky. This chapter (from the Handbook of Risk Theory, Springer Pub.) discusses the distinctive features of cultural cognition as a conception of cultural theory, including its cultural worldview measures; its emphasis on social psychological mechanisms that connect individuals' risk perceptions to their cultural outlooks; and its practical goal of enabling self-conscious management of popular risk perceptions in the interest of promoting scientifically sound public policies that are congenial to persons of diverse outlooks.

Related video: lecture on the cultural cognition of risk

Saturday, April 05, 2014

Great Debate Transcending Our Origins: Violence, Humanity, and the Future



asu |  Celebrate five years of intellectual stimulation and excitement with the Origins Project at the Great Debate, Transcending Our Origins: Violence, Humanity, and the Future, on Saturday, April 5, 2014, at 7:00 p.m.

The first panel of the evening, The Origins of Violence, will feature scholars and writers Steven Pinker, Richard Wrangham, Erica Chenoweth, Adrian Raine, John Mueller and Sarah Mathew discussing the development of violence from the brain to world wars. 

The second panel, The Future: From Medicine and Synthetic Biology to Machine Intelligence, will feature scientists and notable experts Richard Dawkins, Craig Venter, Kim Stanley Robinson, Esther Dyson, Eric Horvitz, George Poste and Randolph Nesse discussing the future of new biomedical and robotic technologies and their impact on humanity. The evening will be moderated by Lawrence Krauss.

Friday, April 04, 2014

inclusive fitness 50 years on...,


royalsociety | The cardinal problem of evolutionary biology is to explain adaptation, or the appearance of design in the living world [1,2]. Darwin [3] convincingly argued that the process of adaptation is driven by natural selection: those heritable variations—i.e. genes—that are associated with greater individual reproductive success are those that will tend to accumulate in natural populations. To the extent that the individual's genes are causally responsible for her improved fitness, natural selection leads to the individual appearing designed as if to maximize her fitness. Thus, Darwinism is a theory of both the process and the purpose of adaptation. 

However, correlations between an individual's genes and her fitness need not reflect a direct, causal relationship. For example, genes for altruism can be associated with greater fitness, despite the direct cost that they inflict on their bearer, if relatives interact as social partners. This is because an individual who carries genes for altruism will tend to have more altruistic social partners. That altruism can be favoured by natural selection suggests that the purpose of adaptation is not, in general, to maximize the individual's personal fitness [4]. 

Although Darwin [3] recognized the potential for such indirect effects to drive the evolution of social behaviours, discussing the logic of kin selection theory in connection with the adaptations of sterile insect workers, it was William D. Hamilton (figure 1), more than a century later, who developed these insights into a full mathematical theory. By quantifying the relative strengths of direct selection, acting via the individual's own reproduction, and indirect selection, acting via the reproduction of the individual's relatives, Hamilton [4] revealed the ultimate criterion that natural selection uses to judge the fate of genes. 

Hamilton's rule states that any trait—altruistic or otherwise—will be favoured by natural selection if and only if the sum of its direct and indirect fitness effects exceeds zero [4–7]. That is, −c + Σᵢ rᵢbᵢ > 0, where −c is the impact that the trait has on the individual's own reproductive success, bᵢ is its impact on the reproductive success of the individual's ith social partner and rᵢ is the genetic relatedness of the two individuals. This mathematical partition of fitness effects underpins the kin selection approach to evolutionary biology [8]. The general principle is that with regards to social behaviours, natural selection is mediated by any positive or negative consequences for recipients, according to their genetic relatedness to the actor. Consequently, individuals should show greater selfish restraint, and can even behave altruistically, when interacting with closer relatives [4].
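
Since the inequality is the heart of the theory, here is a minimal Python sketch of Hamilton's rule as stated above. The example numbers are invented for illustration; only the rule itself comes from the text.

def hamiltons_rule(c, partners):
    """Return True if a trait is favoured by selection under Hamilton's rule.

    c        -- direct fitness cost of the trait to the actor (the -c above)
    partners -- iterable of (r_i, b_i) pairs: genetic relatedness to, and
                fitness benefit conferred on, the actor's i-th social partner
    """
    indirect_benefit = sum(r * b for r, b in partners)
    return indirect_benefit - c > 0

# Altruism toward two full siblings (r = 0.5), each gaining 1.5 fitness units,
# at a cost of 1 to the actor: favoured, since 0.5*1.5 + 0.5*1.5 - 1 = 0.5 > 0.
print(hamiltons_rule(1.0, [(0.5, 1.5), (0.5, 1.5)]))   # True
# The same act directed at two non-relatives (r = 0) is not favoured.
print(hamiltons_rule(1.0, [(0.0, 1.5), (0.0, 1.5)]))   # False

Haldane's quip checks out on the same arithmetic: two brothers (2 × 0.5) or eight cousins (8 × 0.125) exactly balance the cost of one life, which is why the rule demands strictly more than zero.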

Having clarified the process of social adaptation, Hamilton [4] revealed its true purpose: to maximize inclusive fitness (figure 2). That is, Darwinian individuals should strive to maximize the sum of the fitness effects that they have on all their relatives (including themselves), each increment or decrement being weighted by their genetic relatedness. This is the most fundamental revision that has been made to the logic of Darwinism and—aside from a possibly apocryphal quip attributed to J. B. S. Haldane, to the effect that he would give his life to save the lives of two brothers or eight cousins—it was wholly original to Hamilton.

microbial genes, brain & behaviour – epigenetic regulation of the gut–brain axis


wiley |  To date, there is rapidly increasing evidence for host–microbe interaction at virtually all levels of complexity, ranging from direct cell-to-cell communication to extensive systemic signalling, and involving various organs and organ systems, including the central nervous system. As such, the discovery that differential microbial composition is associated with alterations in behaviour and cognition has significantly contributed to establishing the microbiota–gut–brain axis as an extension of the well-accepted gut–brain axis concept. Many efforts have been focused on delineating a role for this axis in health and disease, ranging from stress-related disorders such as depression, anxiety and irritable bowel syndrome to neurodevelopmental disorders such as autism. There is also a growing appreciation of the role of epigenetic mechanisms in shaping brain and behaviour. However, the role of epigenetics in informing host–microbe interactions has received little attention to date. This is despite the fact that there are many plausible routes of interaction between epigenetic mechanisms and the host-microbiota dialogue. From this new perspective we put forward novel, yet testable, hypotheses. Firstly, we suggest that gut-microbial products can affect chromatin plasticity within their host's brain that in turn leads to changes in neuronal transcription and eventually alters host behaviour. Secondly, we argue that the microbiota is an important mediator of gene-environment interactions. Finally, we reason that the microbiota itself may be viewed as an epigenetic entity. In conclusion, the fields of (neuro)epigenetics and microbiology are converging at many levels and more interdisciplinary studies are necessary to unravel the full range of this interaction.

bringing pseudo-science into the science classroom


realclearscience | By stressing the importance of critical thinking and reasoned skepticism, groups like the New England Skeptical Society, the James Randi Educational Foundation, and the Committee for Skeptical Inquiry constantly battle these forces of nonsense, but their labor all too often falls on deaf ears. It's time to take the problem of pseudoscience into the heart of American learning: public schools and universities.

Right now, our education system doesn't appear to be abating pseudoscientific belief. A survey published in 2011 of over 11,000 undergraduates conducted over a 22-year period revealed that nonscientific ways of thinking are surprisingly resistant to formal instruction.

"There was only a modest decline in pseudoscientific beliefs following an undergraduate degree, even for students who had taken two or three science courses," psychologists Rodney Schmaltz and Scott Lilienfeld said of the results.

In a new perspective published Monday in the journal Frontiers in Psychology, Schmaltz and Lilienfeld detail a plan to better instruct students on how to differentiate scientific fact from scientific fiction. And somewhat ironically, it involves introducing pseudoscience into the classroom.
The inception is not for the purpose of teaching pseudoscience, of course; it's for refuting it.
"By incorporating examples of pseudoscience into lectures, instructors can provide students with the tools needed to understand the difference between scientific and pseudoscientific or paranormal claims," the authors say.

Thursday, April 03, 2014

the dawn of monotheism revisited


solami |  §1     In 1961, the Egyptologist Sir Alan Gardiner set the stage for a more enlightening reading of humanity's record - in as much as it relied on Egypt's history and its King List as relayed to us notably by the historian Manetho. In his book "Egypt of the Pharaohs" (p.170), he declared: "Manetho's narrative represents the last stage of a process of falsifications which started within a generation after the triumph of Amosis" over the vilified Hyksos (1575-1550 Old Chronology, henceforth OC).  In this critical analysis, Gardiner has been supported by a growing number of scholars, some thus coming up with remarkable - even if occasionally conflicting - new ideas, theories and insights (1).  One of them, the Islamic scholar and Egyptologist Ahmed Osman, in one of his latest books "Moses Pharaoh of Egypt - the Mystery of Akhenaten Resolved", thus commented our tampered records:
         "Like the accounts of the historian Manetho, the Talmudic stories contain many distortions and accretions arising from the fact that they were transmitted orally for a long time before finally being set down in writing. Yet one can sense that behind the myths there must have lain genuine historical events that had been suppressed from the official accounts of both Egypt and Israel, but had survived in the memories of the generations" (p.24). "The Alexandrian Jews were naturally interested in Manetho's account of their historic links with Egypt, although they found some aspects of it objectionable. His original work therefore did not survive for long before being tampered with [2/3 of Zarathushtra's Avesta reportedly was even deliberately destroyed]" (p.27). And: "Yoyotte ... became one of the few to see through the 'embellishments' of the biblical account and identify the historical core of the story ..." (p.48).
§2     Thus, it was time someone went beyond mere bickering over the confusing King Lists. With his book "A Test of Time", the Assyriologist and Egyptologist David Rohl has presented an archeologically and astronomically supported New Chronology (henceforth NC, with the reign of Ramses II thus placed in the 10th century BC, i.e. dated some 350 years later than traditionally recorded, and the reign of Akhenaton beginning some 3025 years ago and overlapping the ascendancy of David as the successor to King Saul).  If independently confirmed in its key elements by further research, we would find ourselves at the threshold of a new era, providing startling synchronologies of the past to which we were blinded through our own shortcomings and not necessarily by design.  Recognizing this could have vast implications for the future far beyond the bedeviled Middle Eastern cradle where our monotheistic beliefs appear to have their common roots. Osman's comments on the above-quoted koranic and biblical texts may help us to get there:
         "... the Koran presents the confrontation in such a precise way that one wonders if some of the details were left out of the biblical account deliberately. Here Moses sounds less like a magician, more like someone who presents evidence of his authority that convinces the wise men of Egypt, who throw themselves at his feet and thus earn the punishment of [an imposter] Pharaoh. One can only suspect that the biblical editor exercised care to avoid any Egyptian involvement with the Israelite Exodus, even to the extent of replacing Moses by Aaron in the performance of the rituals. ... [During] their sed festival celebrations, Egyptian kings performed rituals that correspond to the 'serpent rod' and 'hand' rituals performed by Moses - and, in performing them, Moses was not using magic but seeking to establish his royal authority.
         I think the correct interpretation of these accounts [of the Bible and the Koran] is that, when Akhenaten was forced to abdicate, he must have taken his royal sceptre to Sinai with him. On the death of Horemheb, the last king of the Eighteenth Dynasty, about a quarter of a century later, he must have seen an opportunity to restore himself to the throne. No heir to the Tuthmosside kings existed and it was Pa-Ramses, commander of Horemheb's army and governor of Zarw, who had claim to the throne. Akhenaten returned to Egypt and the wise men were gathered in order to decide between him and Pa-Ramses. Once they saw the sceptre of royal authority and Akhenaten had performed the sed festival rituals - secret from ordinary citizens - the wise men bowed the knee in front of him, confirming that his was the superior right to the throne, but Pa-Ramses used his army to crush the rebels. Moses was allowed to leave again for Sinai, however, accompanied by the Israelites, his mother's relatives, and the few Egyptians who had been converted to the new [monotheistic] religion that he had attempted to force upon Egypt a quarter of a century earlier. In Sinai the followers of Akhenaten were joined subsequently by some bedouin tribes (the Shasu), who are to be identified as the Midianites of the Bible. No magic was performed, or intended, by Moses. The true explanation of the biblical story could only be that it was relating the political challenge for power in a mythological way - and all the plagues of which we read were natural, seasonal events in Egypt in the course of every year. ..." (p.178f)
         "This would explain how a new version of the Osiris-Horus myth came into existence from the time of the Nineteenth Dynasty. Osiris, the King of Egypt, was said to have had to leave the country for a long time. On his eventual return he was assassinated by Set, who had usurped the throne, but Horus, the son of Osiris, confronted Set at Zarw and slew him. According to my interpretation of events, it was in fact 'Set' who slew 'Horus'; but their roles were later reversed by those who wished to believe in an eternal life for Horus [alternatively, if their roles were not reversed, that might support the idea that Moses/Akhenaton had a role to play in Canaan/Palestine in the post-exodus period]. This new myth developed to the point where Osiris/Horus became the principal god worshipped in Egypt in later times while Set was looked upon as the evil one. This myth could have been a popular reflection of a real historical event - a confrontation between Moses and Seti I on top of the mountain in Moab." (p.187f)
§3     These conclusions could go a long way to explain not only the developments which took place following the 18th Dynasty but many of the undercurrents still gripping the Middle East. Particularly the relations between the descendants of the competitors to the Throne of Pharaoh, Ramses I and Akhenaton (Moses if, for this study, we were to follow Osman's analysis: 2); i.e. the Egyptians on the one side and the "Children of Israel" on the other who, both, would appear to be victims of imaginative falsifiers of history. And they would clear up many mysteries, if it were not for the uncertainties which persist on what happened at the end and after Akhenaton's 17-year reign, particularly whether, how and where Akhenaton lived on. So far, no archeological or historical evidence has become known which would indisputably attest to Akhenaton's death at a certain time and place.
 
§4     Until recently, most scholars tended to interpret the fragmentary archeological data as pointing to a violent death of what many describe as the "heretic Pharaoh" at his regnal year 17. And they mostly associated that end of the first monotheistic reign with the subsequent resurgence of the Amon cult all over Egypt. Initially, this seemed to be supported by esoteric sources, i.e. by currently living persons who, on the basis of personal reincarnation experiences, are said to have been contemporaries of Akhenaton (3). By their very nature, these subjective accounts are just that. They cannot be relied upon without corroborating data. Nevertheless, in the absence of more conclusive information - like contributions from other disciplines - they may constitute imaginative and sometimes helpful hints and pointers for further research and more enlightened analysis.
 
§5     Pointing notably to the striking parallels between Akhenaton's Great Hymn to Aton (Gardiner, p.225f) and Psalm 104, scholars suspect the authors of these texts to be identical. Moreover, the Encyclopaedia Judaica (vol.12, pp. 378, 388, 389, 390, 400) recorded:
         "No primary source of information on Moses exists outside the Bible. ... [In] the Haggadah's hymnic confession Dayyeinu, ... Israel's career from Egypt to the settlement is rehearsed in 13 stages without a reference to Moses. ... According to Artapanos, Moses ... was the first pilosopher, and invented a variety of machines for peace and war. He was also responsible for the political organization of Egypt (having divided the land into 36 nomes) ... According to Josephus, Moses was the most ancient of all legislators in the records of the world. Indeed, he maintains that the very word 'law' was unknown in ancient Greece (Jos., Apion 2:154). ...
         Hecataeus of Abdera presented Moses as the founder of the Jewish state, ascribing to him the conquest of Palestine and the building of Jerusalem and the Temple. He explained, in the Platonic manner, that Moses divided his people into 12 tribes, because 12 is a perfect number, corresponding to the number of months in the year (cf. Plato, Laws, 745b-d; Republic, 546b). ... Very curious is the legend recorded by Israel Lipschuetz b. Gedaliah (Tiferet Yisrael to Kid. end, n.77).  A certain King, having heard of Moses' fame, sent a renowned painter to portray Moses' features.
    On the painter's return with the portrait the king showed it to his sages, who unanimously proclaimed that the features portrayed were those of a degenerate [4]. The astonished king journeyed to the camp of Moses and observed for himself that the portrait did not lie. Moses admitted that the sages were right and that he had been given from birth many evil traits of character but that he had held them under control and succeeded in conquering them. This, the narrative concludes, was Moses' greatness, that, in spite of his tremendous handicaps, he managed to become the man of God. Various attempts have, in fact, been made by some rabbis to ban the further publication of this legend as a denigration of Moses' character."
§6     Together, these and other ancient voices are seen to support the view that the name Moses - incidentally like that of Solomon - may be less the real name and more one which was adopted post-festum for the biblical editors' or their taskmasters' purposes. Could it be then that the person hidden behind the name-of-convenience of Moses is in fact Akhenaton? In that event, he would have lived beyond his reign in Egypt (similarities come to mind with Jesus’ alleged post-crucifixion life in Kashmir (India) and elsewhere and how that persistently recurring story has been treated by the powers that be, i.e. by those who consider themselves as the guardians of the Holy Grail).
 
§7     Sigmund Freud concluded in his 1939 book "Moses and Monotheism" that Moses was not an Israelite but an Egyptian whose teachings derived from Akhenaton's pure monotheism (which he had imposed for apparently imperative economic reasons). That, of course, would require rewriting those stories which ante-date the "exodus of the Israelites" - if that ever happened as such and was not in fact an exodus of monotheistic Egyptians - rather than one of slaves - to what may have been their Palestinian exile (essentially brought about by diseases and power struggles between factions associated with the legitimate, and on the other side with the illegitimate contender to the Throne of Pharaoh?).

publicly funded schools where creationism is taught...,


slate |  A large, publicly funded charter school system in Texas is teaching creationism to its students, Zack Kopplin recently reported in Slate. Creationist teachers don’t even need to be sneaky about it—the Texas state science education standards, as well as recent laws in Louisiana and Tennessee, permit public school teachers to teach “alternatives” to evolution. Meanwhile, in Florida, Indiana, Ohio, Arizona, Washington, D.C., and elsewhere, taxpayer money is funding creationist private schools through state tuition voucher or scholarship programs. As the map below illustrates, creationism in schools isn’t restricted to schoolhouses in remote villages where the separation of church and state is considered less sacred. If you live in any of these states, there’s a good chance your tax money is helping to convince some hapless students that evolution (the basis of all modern biological science, supported by everything we know about geology, genetics, paleontology, and other fields) is some sort of highly contested scientific hypothesis as credible as “God did it.”

Wednesday, April 02, 2014

believe it or not: a narrative antidote to daystarism....,




Forbes | This article appears in the July 16 issue of Forbes magazine as a sidebar to “World Bank Mired in Dysfunction.”

The World Bank is a place where whistle-blowers are shunned, persecuted and booted–not always in that order.

Consider John Kim, a top staffer in the bank’s IT department, who in 2007 leaked damaging documents to me after he determined that there were no internal institutional avenues to honestly deal with wrongdoing. “Sometimes you have to betray your country in order to save it,” Kim says.

In return bank investigators probed his phone records and e-mails, and allegedly hacked into his personal AOL account. After determining he was behind the leaks the bank put him on administrative leave for two years before firing him on Christmas Eve 2010.

With nowhere to turn Kim was guided into the offices of the Washington, D.C.-based Government Accountability Project–the only game in town for public-sector leakers. “Whistle-blowers are the regulators of last resort,” says Beatrice Edwards, the executive director of the group. Edwards helped Kim file an internal case for wrongful termination (World Bank staffers have no recourse to U.S. courts) and in a landmark ruling a five-judge tribunal eventually ordered the bank to reinstate him last May. Despite the decision, the bank retired him in September after 29 years of service.

The U.S. is beginning to notice. The Senate Foreign Relations Committee insisted on inserting a whistle-blowing clause in 2011 after World Bank President Robert Zoellick approached them for an increase in the bank’s capital. But because of the supranational structure of the bank, the Senate’s demands are ultimately toothless.

“We can’t legislate the bank,” explains a Senate staffer. “All we can do is say, ‘We’ll give you this [money] if you do that [whistleblower protection].’ But they say, ‘You can’t make us do that because we can’t answer to 188 different countries.’ ”

daystar?


npr | Flip on television at any hour of the day and you'll likely see the elements of modern televangelism: a stylish set, an emotional spiritual message and a phone number on the screen soliciting donations.

Based in a studio complex between Dallas and Fort Worth, Texas, and broadcasting to a potential audience of 2 billion people around the globe, Daystar calls itself the fastest growing Christian television network in the world.

The Internal Revenue Service considers Daystar something else: a church.

Televangelists have a choice when they deal with the IRS. Some, like Pat Robertson and Billy Graham, register as religious organizations. They're exempt from most taxes but still must file disclosure reports showing how they make and spend their money.

Daystar and dozens of others call themselves churches, which enjoy the greatest protection and privacy of all nonprofit organizations in America.

Churches avoid not only taxes, but any requirement to disclose their finances. And, as NPR has learned, for the last five years churches have avoided virtually any scrutiny whatsoever from the federal government's tax authority.

Today, television evangelists are larger, more numerous, more complex, richer, with bigger audiences than ever before and yet they are the least transparent of all nonprofits.

The top three religious broadcasters — Christian Broadcast Network, Trinity Broadcasting Network and Daystar Television — are worth more than a quarter of a billion dollars combined, according to available records.

Tuesday, April 01, 2014

global system of conehead supremacy?


theunknownmoment |  Future Money Trends: "Who is calling the shots at the Vatican? I am assuming it's not the pope."

Karen Hudes: "Well, there is something called the black pope but that's umm... that's not the ultimate reason why we have been in the fix that we are in. What we have found out, and this sounds implausible, but it's absolutely correct, the fact that it's been held in secret doesn't mean that it's not true. It is true. There is a second species on this planet. They are not extraterrestrial, they are very much with us, they made maps in the previous ice age. The remnants of their civilisations are all over the place. A lot of times along the coast it's submerged because the umm... amount, the sea level has gone up by 400 meters, but this group has large brains. They are very distinct from homo sapiens."

those big heads though...,

Fast forward to 25 minutes

wikipedia |  Judaism - In ancient Israel, the Kohen Gadol (High Priest) wore a headdress called the Mitznefet (Hebrew: מצנפת, often translated into English as "mitre"), which was wound around the head so as to form a broad, flat-topped turban. Attached to it was the Tzitz (Hebrew: ציץ), a plate of solid gold bearing the inscription "Holiness to YHWH"[1] (Exodus 39:14, 39:30). Lesser priests wore a smaller, conical turban.

Byzantine empire - The camelaucum (Greek: καμιλαύκιον, kamilaukion), the headdress both the mitre and the Papal tiara stem from, was originally a cap used by officials of the Imperial Byzantine court. "The tiara [from which the mitre originates] probably developed from the Phrygian cap, or frigium, a conical cap worn in the Graeco-Roman world. In the 10th century the tiara was pictured on papal coins."[2] Other sources claim the tiara developed the other way around, from the mitre. In the late Empire it developed into the closed type of Imperial crown used by Byzantine Emperors (see illustration of Michael III, 842-867).

Worn by a bishop, the mitre is depicted for the first time in two miniatures of the beginning of the eleventh century. The first written mention of it is found in a Bull of Pope Leo IX in the year 1049. By 1150 the use had spread to bishops throughout the West; by the 14th century the tiara was decorated with three crowns.

dna analysis of paracas skulls unknown to any human, primate, or animal...,



collective-evolution |  Paracas is located in the Pisco Province in the Ica Region on the southern coast of Peru. It is home to the groundbreaking 1928 discovery by Julio Tello of a massive graveyard containing tombs filled with the remains of individuals with elongated skulls, now known as the famous Paracas Skulls.

They are approximately 3000 years old, and initial DNA analysis of them has revealed that they may not have come from humans, but from a completely new species, according to Paracas Museum assistant director, researcher and author Brien Foerster. Here is the apparent quote from the geneticist who did the testing:

“Whatever the sample labeled 3A has came from – it had mtDNA with mutations unknown in any human, primate or animal known so far. The data are very sketchy though and a LOT of sequencing still needs to be done to recover the complete mtDNA sequence. But a few fragments I was able to sequence from this sample 3A indicate that if these mutations will hold we are dealing with a new human-like creature, very distant from Homo sapiens, Neanderthals and Denisovans. I am not sure it will even fit into the known evolutionary tree. The question is if they were so different, they could not interbreed with humans. Breeding within their small population. they may have degenerated due to inbreeding. That would explain buried children – they were either low or not viable”

It’s always been thought that the skulls were a result of cranial deformation, where the head is bound or flattened to achieve the shape. Many authors state that the time period to perform this shaping was approximately 6 months to 3 years, but the practice is no longer performed, which makes it hard to really know.  According to Foerster:

“From the doctors that I have spoken to, they have said that you can alter the shape of the skull but you cannot increase the size of the skull. The skull is genetically predetermined to have a certain volume.”

What he is saying is that you can change the shape of the skull, but not its actual volume: the shape, but not the size. This is why these skulls are such a mystery, because of their cranial volume, which in some cases is 2.5 times larger than a conventional human skull. Again, it’s well known that cranial deformation changes the shape of the skull, it’s been done by ancient cultures before by binding the head between two pieces of wood, or binding in cloth, but this does not change the volume or cause the elongation we see with the Paracas skulls.

“As I have said, deformation can alter shape, but not the volume of bone material, and certainly not twice as much. We are dealing with 2 different phenomena: elongation through binding, and elongation via genetics. The Paracas skulls are the largest found in the world, but from what root race stock would they have originated? To suggest that natural elongation was the result of hydrocephaly or some other clinical condition is ridiculous, when one takes into account that at least 90 of them were found in 1928.”

I would also like to quote author and historian Graham Hancock.

“I have grave doubts about stories presently doing the rounds on the internet, and apparently bought hook, line and sinker by many, making extravagant and premature claims about the implications of DNA testing on certain elongated skulls from Paracas in Peru. We have no details of the lab that’s done the testing, and even in the sensationalist reports that have been attracting so much attention it is emphasised that the findings are preliminary. Let’s wait until we see the findings themselves, rather than someone referring to them, and let’s get more detailed results, before we get in the least bit excited. That being said, previously unknown species of human have been coming out of the woodwork recently (Denisovans, Homo Floresiensis) so who knows? It’s always good to keep an open mind but right now I fear this whole thing with the Paracas skulls is going to blow up into a great discredit to alternative history. I do hope I am proved wrong.”

Foerster has raised thousands of dollars so far for the initial DNA testing, but a full genome study to completely verify the theory would cost at least one hundred thousand dollars.

Juan Navarro, the owner and director of the Paracas History Museum, allowed the taking of samples from 5 skulls. The samples collected consisted of hair, tooth, skull bone, and skin. Apparently, the process was documented via photos and video. The samples were given to the geneticist, who was not given any information about where they came from.

moses and akhenaton


grahamhancock | The Bible and the Koran speak of Moses being born in Egypt, brought up in the pharaonic royal palace, and leading the Israelites in their Exodus to Canaan. In historical terms, when did Moses live, and who was the pharaoh of the Oppression? Now that archaeologists have been able to uncover the mysteries of ancient history, we need to find answers to these questions. Egyptian-born Ahmed Osman believes that he has been able to find the answers to these questions, which have bewildered scholars for centuries. He claims that Moses of the Bible is no other than King Akhenaten who ruled Egypt for 17 years in the mid-14th century BC.

During his reign, the Pharaoh Akhenaten was able to abolish the complex pantheon of the ancient Egyptian religion and replace it with a single God, Aten, who had no image or form. Seizing on the striking similarities between the religious vision of Akhenaten and the teachings of Moses, Sigmund Freud was the first to argue that Moses was in fact an Egyptian. Now Ahmed Osman, using recent archaeological discoveries and historical documents, contends that Akhenaten and Moses were one and the same person.
In a stunning retelling of the Exodus story, Osman details the events of Moses/Akhenaten’s life: how he was brought up by Israelite relatives, ruled Egypt for seventeen years, angered many of his subjects by replacing the traditional Egyptian pantheon with worship of Aten, and was forced to abdicate the throne. Retreating to exile in Sinai with his Egyptian and Israelite supporters, he died out of the sight of his followers, presumably at the hands of Seti I, after an unsuccessful attempt to regain his throne.
Osman reveals the Egyptian components in the monotheism preached by Moses as well as his use of Egyptian royal and Egyptian religious expression. He shows that even the Ten Commandments betray the direct influence of Spell 125 in the Egyptian Book of the Dead. Osman’s book, Moses and Akhenaten, provides a radical challenge to the long-standing beliefs concerning the origin of Semitic religion and the puzzle of Akhenaten’s deviation from ancient Egyptian tradition. In fact, if Osman’s contentions are right, many major Old Testament figures would be of Egyptian origin.
