‘The Wind Rises’ is the first Japanese film to make 10 billion yen since 2008

Fans the world over were left a bit deflated the other week after the legend himself, Hayao Miyazaki, announced he was to retire from filmmaking and indeed the world of Studio Ghibli. The 72-year-old followed up the news with a press conference where he explained his decisions and answered some questions for an eager group of journalists.

Fan reception

And it would seem that his retirement hasn’t dampened the spirits of Ghibli’s fan base, because his last ever film, ‘The Wind Rises’ — a film already out in cinemas in Japan — saw last weekend’s box office receipts increase by a massive 34.5% over the previous weekend’s. The news has clearly moved his existing audience, but it also seems to have enticed new viewers to witness the talented Miyazaki at work on the big screen one last time.

Essential viewing

Granted, even as a Ghibli fan, I’ve not seen many of his films projected at my local PictureHouse cinema, but I fully intend to pop down there once ‘The Wind Rises’ is released here in the UK. It’s received some wonderful, positive reviews and is one for anime fans to check out. At present there’s no word on a UK release date, and the film isn’t included in this year’s strong London Film Festival lineup.

Record profits

Miyazaki’s final film has, so far, topped the 10 billion yen mark, making it the first Japanese film to do so in its home country since 2008’s ‘Ponyo’ — a film also written and directed by the maestro himself. The only other film to have passed this milestone in that time is Pixar’s ‘Toy Story 3’.

There’s clearly a lot more buzz around this film because of its significance as the end of an era for the man who’s been the face of Studio Ghibli for decades. What we can expect is for his son, Goro, to continue and build upon the superlative work his father’s achieved over the years as Studio Ghibli looks set to carry on and thrive as one of the most respected animation studios in the world.

The Voters Have Spoken: EA Is Your Worst Company In America For 2012!

Whether it’s on a console, a PC, a smartphone or tablet, hundreds of millions of people play video games every day. Yet most mainstream media covers the industry the same way it treats adult dodge ball leagues and cat fashion shows (both noble ventures, but neither of them multi-billion dollar industries). And the only time you hear legislators discuss video games is when some politician decries them as the death knell for all things righteous in the world (hint: they’re not). Now, after years of being ignored and relegated to steerage, game-players have voted to send a message to Electronic Arts and the gaming business as a whole: Stop treating your loyal customers like crap.

After more than 250,000 votes, Consumerist readers ultimately decided that the type of greed exhibited by EA, which is supposed to be making the world a more fun place, is worse than Bank of America’s avarice, which some would argue is the entire point of operating a bank.

To those who might sneer at something as “non-essential” as a video game company winning the Worst Company In America vote: it’s exactly that kind of attitude that allows people to ignore the complaints while companies like EA nickel-and-dime consumers to death.

For years, while movies and music became more affordable and publishers piled on bonus content — or multiple modes of delivery — as added value to entice customers to buy, video games have continued to be priced like premium goods.

There have even been numerous accusations that EA and its ilk deliberately hold back game content with the sole intent of charging a fee for it at a later date. It’s one thing to support a game with new content that is worth the price. It’s another to put out an inferior — and occasionally broken — product with the mindset of “ah, we’ll fix it later and make some money for doing so.”

New, independent game companies do pop up all the time, but the cost of entering the market has historically been too expensive, making these indie innovators prime targets for acquisition by mega-publishers like EA. Our hope is that the growth of app-based gaming and downloadable games will continue to make it easier for developers to get their products out without the backing of companies that don’t care a lick about the people who fork over their cash.

Oh well, Worst Company In America 2012 is officially in the books. All that’s left to do is send off the Golden Poo to EA.

Traditionally, the Poo has been delivered on its little red pillow. But this year, we’ll give EA three different color options for its pillow, though in the end it’s still the same old Poo.

Thanks again to everyone who voted. See you all again in about 49 weeks!

Against Authoritarianism, For Democracy: Secular Humanism’s Crusades Against Civil Inequality in India

India is no stranger to irreligion and humanism, and never genuinely has been. Throughout its vibrant history, the nation has hosted a plethora of notable freethinkers and skeptics, and has served as a rich repository of humanistic thinking and culture. In spite of its dismally low number of atheists, Indian skeptics such as author Salman Rushdie, filmmaker Satyajit Ray, astrophysicist Subrahmanyan Chandrasekhar, and Prime Minister Jawaharlal Nehru are well known not only in the Indian subcontinent but across the globe. No slackers on that list!

But the civil state of India has been ravaged by systematic discrimination and division into class systems that disparage human equality, promulgated by religious traditions and belief in the caste system (a key component of traditional Hinduism). This has manifested itself in the maintenance of the “untouchables” (or dalits, as they are widely known), a tier of the caste system regarded as “below” all others. Here is a gripping description of their plight, from National Geographic:

Untouchables are outcasts—people considered too impure, too polluted, to rank as worthy beings. Prejudice defines their lives, particularly in the rural areas, where nearly three-quarters of India’s people live. Untouchables are shunned, insulted, banned from temples and higher caste homes, made to eat and drink from separate utensils in public places, and, in extreme but not uncommon cases, are raped, burned, lynched, and gunned down.

The ancient belief system that created the Untouchables overpowers modern law. While India’s constitution forbids caste discrimination and specifically abolishes Untouchability, Hinduism, the religion of 80 percent of India’s population, governs daily life with its hierarchies and rigid social codes. Under its strictures, an Untouchable parent gives birth to an Untouchable child, condemned as unclean from the first breath.

In fact, dalit in Sanskrit means “suppressed” or “broken.”

“It’s like you are born with a stamp on your forehead and you can never get rid of it,” says Amit, one of the community correspondents and an activist for dalit rights.

Also plaguing India is superstition deeply embedded in its religious culture. Fear of ghosts, demons, haunted houses, and otherworldly specters runs rampant. Prevailing beliefs dictated, for instance, that pregnant women should not view solar eclipses (a notion with no scientific basis whatsoever). Sustained and fueled by fear and misunderstanding of the unknown, this superstitious streak in Indian culture has yet to be dispelled.

Humanism Overturning Prejudice and Superstition

Near the forefront of the battle against prejudice, irrationality, and the awful, derogatory caste system is a fresh wave of secular humanism, spearheaded by two late heroes: Goparaju Rao (known in India as Gora) and Saraswathi Gora. The husband and wife served and died as icons of social progress and harbingers of an age of reason, an age that deemed India capable of relieving itself of misapprehended beliefs lacking any logical basis, and of firmly rooting in their place a respect for science and critical evaluation. After he was forced to resign from his profession for expressing his atheism, Gora devoted himself to dispelling the superstition and irrationality institutionalized and revered in his nation; he and his spouse established the Atheist Center, a home for the expression and discourse of thoughts pertaining to religion and disbelief. The Atheist Center was awarded the International Humanist Award in 1986.

Together, the couple crusaded without relent against the strain of anti-intellectualism that had rooted itself in the culture of India, surreptitiously filling the masses with irrational beliefs and practices to the point where the baselessness of those notions went unrecognized. He and his wife publicly viewed solar eclipses, flouting the superstitious belief that pregnant women should not do so. They resided in supposedly haunted houses to dismiss the myths about such places. Every full-moon night marked the periodic “cosmopolitan dinner,” hosted by Gora, at which Indians of all religions and castes were warmly invited to dine together, signifying the harmonious ideals they strove to achieve. As an act of dissent against the government, the couple also organized “beef and pork” parties, defying proposed laws that would prohibit such foods on religious grounds. Always the contrarians, the couple rose to fame as valiant social reformers working to unburden India of its baseless faith and superstition.

Gora and Saraswathi viewed the caste system with abhorrence. They regularly conducted inter-caste and inter-religious marriages to dispel prejudices amongst civil groups. In fact, their own daughter was wed to a dalit, or “untouchable”.

From The Hans India:

The atheist movement in South India is a fertile ground for the onward march of atheism and rationalism. It was a citadel of the freedom struggle and social reform movements, challenging colonialism, imperialism and the oppressive Nizam rule.

It fought against caste system and untouchability with democratic approach and egalitarian value system. It champions castelessness and strives for a post-religious society. It became a role model to the world in its commitment.

Gora engaged in a series of profound and thoughtful conversations with the well-known Mohandas Gandhi. Their conversations were recorded in a book titled An Atheist With Gandhi. This collection of excerpts from the discourse between the two has often been described as “brief” but “eliciting wonderful thoughts and analysis.”

The Wonderfulness of Thought

In a nation suffused with a culture that espouses irrationality, that sings hymns to otherworldly spirits no one has put forth a morsel of evidence for, that prizes blind submission over critical discourse and dissent, that is–simply put–superstitious and proud of it, it takes quite some valor to take a stand and challenge the roots of the dogma that is draped over the minds of your peers, your parents, and your society.

Saraswathi Gora and Goparaju Rao: a couple shaped in the flames of the irrationality gripping India, flames that seared and decimated the grains of critical thought in favor of silence and faith. A couple that managed not only to escape from those flames, but to strive to smother the kindling of injustice and unreason. Consistent adversaries of authoritarian illiberalism and unscientific conviction, this pair of social reformers are not to be trifled with when it comes to matters of social progress and the advancement of logical thinking.

I trust I speak for all humanists when I applaud the paramount and rousing bravery demonstrated by those paragons of brilliance and intellectual virtue, Saraswathi Gora and Goparaju Rao, in combating the bastion of unreason in south India.

Why Are Bitcoiners Going to Jail for Money Laundering While Big Banks Walk?

BitInstant CEO Charlie Shrem, along with alleged co-conspirator Robert Faiella, was arrested by federal authorities last week for allegedly laundering more than $1 million worth of Bitcoins. That is a tiny amount compared to the largest drug-and-terrorism money-laundering case ever: when British bank HSBC was found in 2012 to have laundered billions, the firm paid a fine of $1.9 billion. Authorities made no arrests, and HSBC still turned a $13.5 billion profit that year.

Rolling Stone’s Matt Taibbi detailed the crimes HSBC helped fund, including “tens of thousands of murders” and laundering money for Al Qaeda and Hezbollah. By contrast, Silk Road users have only been shown to have bought and sold drugs. (The six murders-for-hire commissioned by alleged former Silk Road head Ross Ulbricht were never carried out.) In fact, by moving transactions online, Silk Road likely decreased the violence associated with the drug trade.

Again, no individual associated with HSBC paid any money or spent a day in jail. Shrem is currently in custody. Why is there such a disparity? Clearly the size, scope, violence or effect of the crime can’t justify the discrepancy in response. The Justice Department explained it by saying that HSBC is, in essence, Too Big to Jail.

“Had the U.S. authorities decided to press criminal charges,” said Assistant Attorney General Lanny Breuer during the announcement of the HSBC settlement, “HSBC would almost certainly have lost its banking license in the U.S., the future of the institution would have been under threat and the entire banking system would have been destabilized.”

What are the moral and practical underpinnings of a law whose punishments are harshest for those who violate it least? The Justice Department is here admitting that the costs of fairly enforcing the money laundering laws as written — potentially shaking up a major bank — outweigh the benefits.

Charlie Shrem founded BitInstant, a service that let users quickly buy into Bitcoin, in his family’s garage with $10,000 of their money while still in college. He became a founding member of the Bitcoin Foundation and served as vice chairman of the board — a position he’s since stepped down from. But his startup was plagued by regulatory troubles from the start, fueled by a lack of clear regulation pertaining to Bitcoin. In a trailer for the Bitcoin documentary “The Rise and Rise of Bitcoin,” Shrem claims to spend “thousands of dollars on lawyers every day just to make sure that I’m not gonna go to jail.”

While Shrem operated in regulatory uncertainty, and didn’t know the charges against him on the day he was arrested, HSBC received 30 different formal warnings in just one brief stretch between 2005 and 2006. Even then, HSBC was openly flouting the rules. The bankers knew, for instance, that they were funneling money for people such as one of 20 early financiers of Al Qaeda, a member of what Osama bin Laden himself apparently called the “Golden Chain,” according to Taibbi. Another customer was powerful Syrian businessman Rami Makhlouf, a close confidant of the Assad family.

Some, including Taibbi, have called for equal jail time for all offenders. There is no doubt that the Justice Department is overstating the lasting, worldwide effect of justly applying money laundering laws. However, the drawbacks to enforcing laws against what’s estimated to make up a third of all transactions are very real. In this reality, it’s legitimate to ask whether laws against money laundering should be applied to anyone.

Money laundering is, simply, the process of concealing sources of money. While the standard image of money laundering involves murders, Mexican narco-gangs, and Al Qaeda, in reality there are many reasons that normal people would want to keep their transactions anonymous — which is a big reason why Bitcoin gained popularity with libertarians in the first place.

As J. Orlin Grabbe wrote, “Anyone who has studied the evolution of money-laundering statutes in the U.S. and elsewhere will realize that the ‘crime’ of money laundering boils down to a single, basic prohibited act: Doing something and not telling the government about it.” Criminalizing this means that by default government has the right to know the source of all of every citizen’s money. In some jurisdictions, money laundering can be just using financial systems or services that do not identify or track sources or destinations.

Writing in American Banker, Bitcoin advocate Jon Matonis explains that “from President Roosevelt’s 1933 seizure of personal gold to the Nazi confiscation of Jewish wealth to the recent deposit theft at Cyprus banks, asset plundering by governments has a long and colorful tradition. Protecting wealth from oppressive regimes continues to this day.”

In that piece, Matonis calls money laundering the thoughtcrime of finance, a sentiment that’s gained traction in libertarian circles. Hiding or failing to report where money comes from is, in and of itself, a victimless crime. The theory is that everyone owes it to the government to make enforcing laws against violent crime easier. But that’s not really accurate, as most of the laundered money is used in other crimes whose violence stems from their being illegal, such as gambling and the drug trade.

This reporting to the government comes at a significant cost, both in terms of resources and privacy.

The Economist has estimated the annual costs of anti-money laundering efforts in Europe and North America to be in the billions. Even its most legitimate function, trying to keep people from financing terror, has been deemed a costly failure by the magazine. Curiously enough, the Economist concedes that efforts to reduce identity theft and credit card fraud are most effective at combating money laundering.

Perhaps this cost could be justified if the information gathered through the reporting requirements was used to cut off funds to terrorists. But as the HSBC case shows, that’s not the case. After getting notice after notice about failing to properly report on its customers, HSBC simply hired former call center employees to “investigate” cases of money laundering to unsavory characters. And when one employee actually did, he was fired.

Besides costing billions, efforts to stamp out money laundering also erode privacy. Ensuring every transaction is above board forces banks to be cops through so-called “know your customer” laws. These laws essentially conscript private businesses “into agents of the surveillance state,” according to the American Civil Liberties Union. Bitcoin offers an interesting counterpoint: even if accounts may be anonymous, transactions are all public, which is a level of transparency not seen in the fiat currency system.

So why does alleged Bitcoin laundering deserve jail time? On a pure cost-to-benefit basis, perhaps it makes sense to jail Shrem while giving HSBC executives a slap-on-the-wrist fine. Revoking the bank’s license would rock the entire financial system, while Shrem’s enterprise was already on hold at the time of his arrest.

However, laws which result in jail time for minor infractions while the worst offenders walk free deserve their own cost/benefit analysis. If money laundering laws are worth their cost to companies, to the government, and to privacy, surely they are worth applying fairly and evenly. If not, perhaps it’s time to rethink whether they make sense at all.

Matt Damon vs. Sarah Palin and the Dinosaurs

On September 10, 2008, a video of actor Matt Damon was released to the press that quickly got posted on YouTube, and has now gotten over two million hits. Here’s what he said:

I think there’s a really good chance that Sarah Palin could be president, and I think that’s a really scary thing because I don’t know anything about her. I don’t think in eight weeks I’m gonna know anything about her. I know that she was a mayor of a really, really small town, and she’s governor of Alaska for less than two years. I just don’t understand. I think the pick was made for political purposes, but in terms of governance, it’s a disaster. You do the actuary tables, you know, there’s a one out of three chance, if not more, that McCain doesn’t survive his first term, and it’ll be President Palin. And it really, you know, I was talking about it earlier, it’s like a really bad Disney movie, you know, the hockey mom, you know, “I’m just a hockey mom from Alaska”—and she’s the president. And it’s like she’s facing down Vladimir Putin and, you know, using the folksy stuff she learned at the hockey rink, you know, it’s just absurd. It’s totally absurd, and I don’t understand why more people aren’t talking about how absurd it is. I … it’s a really terrifying possibility.

The fact that we’ve gotten this far and we’re that close to this being a reality is crazy. Crazy. I mean, did she really—I need to know if she really thinks dinosaurs were here 4,000 years ago. That’s an important … I want to know that. I really do. Because she’s gonna have the nuclear codes, you know. I wanna know if she thinks dinosaurs were here 4,000 years ago or if she banned books or tried to ban books. I mean, you know, we can’t have that.

Before I was a Christian, I could have easily said the same kind of things as Matt Damon. In fact, I probably did say similar things, often—and especially to Christians. Like: “How can these people believe in things like, ‘in the beginning God,’ Adam and Eve, Noah’s ark and that man walked around with dinosaurs a few thousand years ago?” I can still hear myself mocking Christianity as I type on my laptop.

But the truth is, Matt Damon, like so many people (and like me) was probably never given a chance to hear the case for Christianity. A sizable cross-section of our country hears only the case against Christianity—presented in nearly every classroom across America, from grammar school to university. Matt Damon has the added benefit of his “education” within the Hollywood elite.

Generally speaking, Matt is opposed to anyone who is not a liberal. Therefore, he’s opposed to anyone opposed to Obama. But he claims to oppose Sarah Palin due to her lack of experience and her young earth views. I bet he’d be bothered by me having the nuclear codes as well. I, too, believe the earth is thousands—not millions—of years old. But I don’t believe many Christians today believe in a strict 4,000-year age of the earth. That number is rooted in something known as Ussher’s chronology from the 17th century, in which James Ussher argued the date of creation was October 23, 4004 B.C. It has now been discredited.

Let me say that most of the Christian leaders and pastors I know are “young earthers” who also believe the earth is thousands, not millions, of years old. There are a number of “old earthers” that I still respect. I just disagree with them. What I especially oppose are the improper motives of people on both sides who say, essentially, “It’s got to be my way or it’s heresy.” I know young earthers who say, “If you believe the earth is more than 10,000 years old, then you’re not taking the Bible literally, you probably don’t believe in inerrancy and you might not even really be saved.” I’ve met old earthers who say, “If you don’t believe the Earth is billions of years old, then you’re rejecting God’s second book—that of general revelation—and since you’re unwilling to synthesize your beliefs about theology with the facts of science, then you’re the Christian equivalent of a flat earther or a Holocaust denier.”

There are good cases to be made for both sides. And, based upon evidence and argument, I could be persuaded out of my young earth position. In my view nothing essential rests upon it. I could accept a “young man on an old earth” thesis (as does Hugh Ross) and have no problem with it biblically or scientifically. I’m just not there.

Let’s face it, the impetus for the “old man, old earth” thesis is the time necessary for man to evolve from an ape. But, if you believe God created man a few thousand years ago, then all the skull and tooth fragments from any alleged “missing link” (that’s still missing) will never prove modern man is the descendant of apes. It can only prove similar design, not common descent.

The biggest problem for my young earth view is the age of light and, in critical terms, what I call the level of “the divine deception.” Ask a Christian junior high group if God can make trees out of nothing and you’ll get a quick agreement. Ask them to suppose that God creates a giant sequoia right in front of them right now. Ask if the tree would have bark, limbs, roots, and leaves. Again, a hearty yes. Then ask about whether there would be tree rings inside and some of the kids drop off. Why? Because tree rings mean time and time can’t have transpired if He just created the tree in front of you. Ask those who accept the tree rings why they think so, and you’ll hear, “Because God can create with the appearance of age.”

In fact, there’s almost nothing you can think of that doesn’t have the “tree ring” element to it. To create something means you must give it an age—and it always could have been younger. Even light. I’m no astrophysicist, but I read that light itself has elements of age to it and that distant starlight has these elements of decay or age. I used to say about distant starlight that “God had created the light in transit”—and I was intellectually satisfied, with no reservations. Now, however, I learn that the divine deception extends to the atomic level; the light is “aged” at the level of infinitesimal minutia. For many, that is a level of divine deception too great for their plausibility level. They’d prefer to believe there is no “deception”—that the universe is billions of years old (the earth too) and perhaps that man is millions of years old rather than thousands.

I, on the other hand, prefer to think of it not as deception, but as consistency. Adam could have been created a toddler, those trees in the Garden could have been seeds and those animals could have been newborns. But none of this would have served God’s perfect will and purposes. He was concerned with the “Who” not the “how.” The repeated refrain in Genesis 1—“and God said”—doesn’t give much detail on the how. God spoke. At the end of His creative work, God looked at it all and saw that “it was very good.” When you have infinite material and infinite energy, the “how” isn’t really an issue.

For me, once I was persuaded there was a Creator, the “how” of the creation became a matter of reduced importance. I wonder if Matt Damon’s God could have created man and the dinosaurs and allowed them to be contemporaneous. If not, why not?

More than likely, Matt doesn’t believe in God at all. In that case, he’s not really against Sarah Palin—or even Christians. He’s against the Creator himself.

Are diamonds really rare? Myths and misconceptions about diamonds

Diamonds are our most popular gemstone. That hasn’t always been the case. It was only in the last century that diamonds became readily available. Prior to that, rubies and sapphires were the most popular gems, especially for engagement rings.

Kimberley Diamond Mine, South Africa. Creative Commons licensed (BY-SA) Flickr photo shared by string_bass_dave.

The popularity of diamonds is due primarily to the DeBeers organization. They set up the first large-scale diamond mines in South Africa. Then they began one of the most successful advertising campaigns in history, convincing consumers that engagement rings should always have a diamond.

With proper encouragement, the movie industry displayed its most glamorous women draped in diamonds. As a result, diamonds soon became a top status symbol for the rich and famous. This peaked with Marilyn Monroe’s performance of “Diamonds Are a Girl’s Best Friend” in the film Gentlemen Prefer Blondes.

Even after winning the consumers’ admiration, DeBeers continued their advertising. With the discovery of diamonds in the Soviet Union, a new campaign was created to sell anniversary bands. These made good use of the small, but nice quality diamonds the Soviet Union produced.

While the DeBeers company did wonderful things for the diamond industry, not everything about DeBeers is nice. As diamonds were discovered in other parts of Africa and South America, DeBeers managed to get control of the rough diamond supply. The tactics used to gain control of these rough diamond supplies are alleged to include murder and kidnapping.

DeBeers maintained monopolistic control over the diamond market for several decades. They carefully released only enough rough diamonds to satisfy current demand, continually adjusting how much rough was made available. This kept prices escalating and, of course, reinforced the perception of rarity. DeBeers actually mined considerably more rough diamonds than they sold, and they keep a large warehouse of uncut diamonds in London. As a result, they were not allowed to do business in the US, Canada, and a few other countries.

In the last couple decades of the 20th century, things began to change. Satellite technology designed to find likely oil reserves also revealed the geology likely to hold diamonds. As a result, new discoveries began to multiply. Australia was one of the first developed nations to discover major diamond resources. DeBeers was able to make a deal with them to distribute all the rough, except for the very rare pink diamonds.

They also made a deal with the Soviet Union to distribute its rough diamonds. However, shortly after the breakup of the Soviet Union, the Russians let the contract expire and began to sell the diamonds themselves.

The latest major diamond reserve was found in Canada. DeBeers could not make a deal with the Canadians, who are cutting and selling the stones themselves.

It is difficult to tell what the future will hold. Several sites are being explored and it is likely more diamond deposits will be found in the near future. DeBeers still controls approximately 75% to 80% of the diamond rough. The other suppliers have so far been content to sell at the same prices as DeBeers. However, if the law of supply and demand ever catches up to the diamond market, prices are likely to drop considerably. It is difficult to tell how this would play out, but DeBeers has a large inventory of uncut diamonds and would be in an excellent position for a price war.

Myths and Misconceptions

Here are some popular myths that you need to be aware of.

MYTH: Diamonds are rare.

Diamonds are the hardest material found on earth. Other than that, they hold no unique distinctions. All gem-grade materials are rare, composing just a tiny fraction of the earth. However, among gems, diamonds are actually the most common. If you doubt this, ask yourself: “How many women do you know who do not own at least one diamond?” Now ask the same question about other gems.

While we are still learning about the interior of the earth, current information shows that diamonds are likely the most common gem in nature. (See Gem Formation.)

Outside the earth, diamonds are also common. A recent discovery shows that some stars collapse on themselves, creating giant diamond crystals. In the constellation Centaurus lies a white dwarf that has crystallized into a diamond 2,500 miles in diameter, weighing 10 billion trillion trillion carats.

MYTH: Diamonds are the most valuable gem.

You cannot say that one species of a gem is the most valuable. To do a comparison, you need to judge gems according to size and quality. This chart is based on top quality gems in different sizes. However, note that pure red rubies are so rare there is no trade data available. The prices listed are for Burmese rubies.

As you can see, diamonds are very costly, but not the most expensive gem in any size. If you were to do a comparison of other qualities, the results would be similar. If you are looking to invest in gems you should read our article on the inner workings of the gem trade.

MYTH: Diamonds are precious.

Precious means valuable. In the 18th century, a French jeweler began describing gems as either precious or semiprecious. The categories are still used in merchandising but are frowned upon by professionals as they are nearly meaningless distinctions.

For example, garnets are considered semiprecious, but tsavorite garnets have sold for as much as $10,000 per carat. That seems pretty “precious” to me!

On the other hand, diamonds are very valuable only in their better grades and medium to large sizes. Small, low-quality diamonds are available in quantity for just $1 apiece. A quick search of eBay will turn up several diamonds under $20. These are far from precious.

MYTH: Diamonds are the most brilliant gemstone.

Brilliance is determined by the cut and the refractive index of the material. Diamonds have a very high refractive index of 2.41, so a properly cut diamond can be exceptionally brilliant. However, this is nothing compared to the 2.9 RI of rutile. Not counting synthetics, there are at least 15 minerals with a higher refractive index than diamond!
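One way to see why refractive index drives brilliance is the standard normal-incidence Fresnel reflectance formula, R = ((n − 1)/(n + 1))². A quick sketch (the RI values for diamond and rutile come from this article; quartz at 1.55 is added here for contrast):

```python
# Normal-incidence Fresnel reflectance: R = ((n - 1) / (n + 1))^2
# A higher refractive index returns more light to the eye, hence more brilliance.
def reflectance(n):
    return ((n - 1) / (n + 1)) ** 2

for name, ri in [("quartz", 1.55), ("diamond", 2.41), ("rutile", 2.90)]:
    print(f"{name}: RI {ri} -> {reflectance(ri):.1%} reflected at normal incidence")
```

Diamond reflects roughly 17% of light hitting a facet head-on, rutile about 24%, while a common gem like quartz manages under 5% — which is why cut quality matters so much more for low-RI stones.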

MYTH: A person can make a lot of money selling diamonds.

As the Internet has continued to proliferate and GIA has established well-accepted grading standards for diamonds, margins on cut diamonds have become extremely thin. It is not uncommon for a diamond dealer to make gross margins inside of 5%. Compare that to virtually any other industry and you won’t think it is such a great business.

MYTH: Diamonds have more “fire” than any other gemstone.

Diamonds are known for their fire, or dispersion: the ability to separate white light into the colors of the rainbow. Diamond has a dispersion of 0.044, which is quite high. However, it is a far cry from gems like rutile, with a dispersion of 0.330!

What is a Diamond?

Diamond is a natural mineral, but it is also produced in the laboratory. Lab-made diamonds are primarily used as abrasives, but they are beginning to make their way into the jewelry industry. (See Understanding Gem Synthetics and Identification of Synthetic Diamonds.)

Gemologically speaking, diamond is a mineral with a chemical composition of C (carbon) that crystallizes in the isometric system. (See How Gems Are Classified.)

With a hardness of 10, diamonds are the hardest substance in nature. Harder substances have been created in the laboratory, but they are extremely brittle and have no practical use. If a harder substance is ever found that does not break down so quickly, it will greatly reduce the time needed to cut diamonds.

Diamonds have a refractive index of 2.41, which is very high. Because they form in the isometric system, they have no birefringence or pleochroism. Their specific gravity of 3.51 to 3.53 is a bit above average.

87 Deceased NFL Players Test Positive for Brain Disease

A total of 87 out of 91 former NFL players have tested positive for the brain disease at the center of the debate over concussions in football, according to new figures from the nation’s largest brain bank focused on the study of traumatic head injury.

Researchers with the Department of Veterans Affairs and Boston University have now identified the degenerative disease known as chronic traumatic encephalopathy, or CTE, in 96 percent of the NFL players they've examined and in 79 percent of all football players. The disease is widely believed to stem from repetitive trauma to the head, and can lead to conditions such as memory loss, depression and dementia.

In total, the lab has found CTE in the brain tissue in 131 out of 165 individuals who, before their deaths, played football either professionally, semi-professionally, in college or in high school.
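The percentages quoted above follow directly from the raw counts reported by the brain bank; a quick sanity check (all numbers taken from this article):

```python
# Reproduce the reported CTE rates from the raw case counts.
nfl_positive, nfl_tested = 87, 91     # former NFL players
all_positive, all_tested = 131, 165   # all levels of football combined

nfl_rate = nfl_positive / nfl_tested       # ~0.956, reported as 96 percent
overall_rate = all_positive / all_tested   # ~0.794, reported as 79 percent
print(f"NFL players: {nfl_rate:.0%}; all football players: {overall_rate:.0%}")
```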

Forty percent of those who tested positive were the offensive and defensive linemen who come into contact with one another on every play of a game, according to numbers shared by the brain bank with FRONTLINE. That finding supports past research suggesting that it’s the repeat, more minor head trauma that occurs regularly in football that may pose the greatest risk to players, as opposed to just the sometimes violent collisions that cause concussions.

But the figures come with several important caveats, as testing for the disease can be an imperfect process. Brain scans have been used to identify signs of CTE in living players, but the disease can only be definitively identified posthumously. As such, many of the players who have donated their brains for testing suspected that they had the disease while still alive, leaving researchers with a skewed population to work with.

Even with those caveats, the latest numbers are “remarkably consistent” with past research from the center suggesting a link between football and long-term brain disease, said Dr. Ann McKee, the facility’s director and chief of neuropathology at the VA Boston Healthcare System.

“People think that we’re blowing this out of proportion, that this is a very rare disease and that we’re sensationalizing it,” said McKee, who runs the lab as part of a collaboration between the VA and BU. “My response is that where I sit, this is a very real disease. We have had no problem identifying it in hundreds of players.”

In a statement, a spokesman for the NFL said, “We are dedicated to making football safer and continue to take steps to protect players, including rule changes, advanced sideline technology, and expanded medical resources. We continue to make significant investments in independent research through our gifts to Boston University, the [National Institutes of Health] and other efforts to accelerate the science and understanding of these issues.”

The latest update from the brain bank, which in 2010 received a $1 million research grant from the NFL, comes at a time when the league is able to boast measurable progress in reducing head injuries. In its 2015 Health & Safety Report, the NFL said that concussions in regular season games fell 35 percent over the past two seasons, from 173 in 2012 to 112 last season. A separate analysis by FRONTLINE that factors in concussions reported by teams during the preseason and the playoffs shows a smaller decrease of 28 percent.
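The NFL's 35 percent figure can be reproduced from the regular-season concussion counts given above (FRONTLINE's separate 28 percent figure also includes preseason and playoff concussions, whose totals aren't stated here, so it isn't checked):

```python
# Regular-season concussions, per the NFL's 2015 Health & Safety Report.
concussions_2012 = 173
concussions_2014 = 112

drop = (concussions_2012 - concussions_2014) / concussions_2012
print(f"Decline over two seasons: {drop:.0%}")  # matches the reported 35 percent
```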

Off the field, the league has revised safety rules to minimize head-to-head hits, and invested millions into research. In April, it also won final approval for a potential $1 billion settlement with roughly 5,000 former players who have sued it over past head injuries.

Still, at the start of a new season of play, the NFL once again finds itself grappling to turn the page on the central argument in the class-action lawsuit: that for years it sought to conceal a link between football and long-term brain disease.

The latest challenge to that effort came two weeks ago with the trailer for a forthcoming Hollywood film about the neuropathologist who first discovered CTE. When the trailer was released, it quickly went viral, leaving the NFL bracing for a new round of scrutiny over past efforts to deny any such connection.

The film, Concussion, starring Will Smith, traces the story of Bennet Omalu, who in 2005 shocked the football establishment with an article in the journal Neurosurgery detailing his discovery of CTE in the brain of former Pittsburgh Steelers center Mike Webster. At the VA lab and elsewhere, CTE has since been found in players such as Hall of Famer Junior Seau, former NFL Man of the Year Dave Duerson, and Colts tight end John Mackey, a past head of the players' union.

While the story is not a new one, for the NFL, it represents a high-profile and potentially embarrassing cinematic interpretation of a period in which the league sought to refute research suggesting football may contribute to brain disease.

From 2003 to 2009, for example, the NFL’s now disbanded Mild Traumatic Brain Injury Committee concluded in a series of scientific papers that “no NFL player” had experienced chronic brain damage from repeat concussions, and that “Professional football players do not sustain frequent repetitive blows to the brain on a regular basis.”

In the case of Omalu, league doctors publicly assailed his research, and in a rare move, demanded a retraction of his study. When Omalu spoke to FRONTLINE about the incident for the 2013 documentary, League of Denial: The NFL’s Concussion Crisis, he said, “You can’t go against the NFL. They’ll squash you.”

In a conversation with FRONTLINE, McKee said that her biggest challenge remains “convincing people this is an actual disease.” Whatever pockets of resistance still exist, she said, have primarily come from those with a “vested interest” in football.

“People want to make this just Alzheimer’s disease or aging and not really a disease,” according to McKee. “I think there’s fewer of those people, but that’s still one of our major hurdles.”

Smoking Marijuana Causes ‘Complete Remission’ of Crohn’s Disease, No Side Effects, New Study Shows

Marijuana (scientific name "cannabis") performed like a champ in the first-ever placebo-controlled trial of the drug to treat Crohn's disease, a form of inflammatory bowel disease.

The disease of the digestive tract afflicts 400,000 to 600,000 people in North America alone, causing abdominal pain, diarrhea (which can be bloody), severe vomiting, and weight loss, as well as secondary skin rashes, arthritis, inflammation of the eye, tiredness, and lack of concentration.

Smoking pot caused a "complete remission" of Crohn's disease compared to placebo in half the patients who lit up for eight weeks, according to clinical trial data to be published in the journal Clinical Gastroenterology and Hepatology.

Researchers at Israel’s Meir Medical Center took 21 people with intractable, severe Crohn’s disease and gave 11 of them two joints a day for eight weeks. “The standardized cannabis cigarettes” contained 23 percent THC and 0.5 percent CBD (cannabidiol). (Such marijuana is available on dispensary shelves in San Francisco, Oakland, and other cities that have regulated access to the drug.) The other ten subjects smoked placebo cigarettes containing no active cannabinoids.

Investigators reported that smoking weed caused a “complete remission” of Crohn’s Disease in five of the 11 subjects. Another five of the eleven test subjects saw their Crohn’s Disease symptoms cut in half. Furthermore, “subjects receiving cannabis reported improved appetite and sleep, with no significant side effects.”

The study is the first placebo-controlled clinical trial to assess the consumption of cannabis for the treatment of Crohn’s, notes NORML. All of the patients had intractable forms of the disease and did not respond to conventional treatments. Still, the United States government claims that marijuana is as dangerous as heroin and has no medical use. U.S. Attorney Melinda Haag is waging a war on safe access to medical cannabis in the Bay Area.

Judge Says 10 Rare Gold Coins Worth $80 Million Belong to Uncle Sam

A judge ruled that 10 rare gold coins worth $80 million belong to the U.S. government, not the family that had sued the U.S. Treasury claiming the coins were illegally seized.

The 1933 Saint-Gaudens double eagle coin was originally valued at $20, but one owned by King Farouk of Egypt sold for as much as $7.5 million at a Sotheby’s auction in 2002, according to Courthouse News.

After the U.S. abandoned the gold standard, most of the 445,500 double eagles that the Philadelphia Mint had struck were melted into gold bars.

However, a Philadelphia Mint cashier had managed to give or sell some of them to a local coin dealer, Israel Switt.

In 2003, Switt’s family, his daughter, Joan Langbord, and two grandsons, drilled opened a safety deposit box that had belonged to him and found the 10 coins.

When the Langbords gave the coins to the Philadelphia Mint for authentication, the government seized them without compensating the family.

The Langbords sued, saying the coins belonged to them.

In 2011, a jury decided that the coins belonged to the government, but the family appealed.

Last week, Judge Legrome Davis of the U.S. District Court for the Eastern District of Pennsylvania affirmed that decision, saying "the coins in question were not lawfully removed from the United States Mint."

Barry Berke, an attorney for the Langbords, said, "This is a case that raises many novel legal questions, including the limits on the government's power to confiscate property. The Langbord family will be filing an appeal and looks forward to addressing these important issues before the 3rd Circuit."

The family said in its suit that in another seizure of the 1933 double eagle, the government split the proceeds with the owner after the coin sold for $7.59 million in 2002.

regarding the difference between embracing and exploiting geek culture

I’ve gotten a ton of criticism from people about the I Am a Geek video that launched yesterday, and I feel the need to respond to it.

After watching the video yesterday, I was impressed by the production values, and I thought it was really awesome that it was just one small part of a larger project. I love that the whole thing is supposed to encourage literacy (if you really look for the links) and intends to support a good cause. As a writer, I certainly want more people to be readers!

But as I watched it a second and a third time, something didn't feel quite right to me. I couldn't put my finger on it until e-mail started flooding in from people who could: this was supposed to be about refuting stereotypes and celebrating the things we love, but it ends up feeling like an attempt to convince the Cool Kids that we're really just like them, and a promotional opportunity for celebrities who don't know a damn thing about our geek culture and don't care about the people who create and live in it.

I was under the impression that this video would feature actual geeks who are important to our culture, like Woz, Felicia Day, Leo Laporte, and Jonathan Coulton. Instead, I saw a lot of entrepreneurs who have good marketing instincts, joined by a bunch of celebrities who are attempting to co-opt our culture because it’s what their publicity team is telling them to do.

When you’re speaking to people who read TMZ and People magazine, getting contributions from MC Hammer, Ashton Kutcher and Shaq is a logical choice. But when you’re speaking to geeks, it’s insulting to us to pretend that they are part of and speak for our culture. Those people are not geeks; they’re celebrities who happen to use Twitter. Featuring them as “geeks” undermines the whole effort, because they aren’t like us. I’ve been a geek my whole life. I’ve suffered for it, I’ve struggled because of it, and I’ve worked incredibly hard to remove the social stigma associated with all these things we love, like gaming and programming. It’s like a slap in the face to be associated with these people who claim to be like me, and want to be part of our culture, but couldn’t tell you the difference between Slackware and Debian, a d8 and a d10, or how to use vi or emacs. In other words, they haven’t earned it, but they’re wrapping themselves in our flag because their PR people told them to.

Having someone in a video that purports to celebrate our geek culture say that they don’t play D&D, like playing an RPG is something to be ashamed of, is profoundly offensive to me, because I play D&D. In fact, it’s the chief reason I am a geek. D&D isn’t anything to be ashamed of, it’s awesome. I don’t recall seeing that in the script I was given, and if I had, I never would have agreed to be part of this project.

I loved the idea of creating a video that celebrates our culture and shows that we’re proud to be in it. That’s what I thought this would be, but I feel like we ended up with some kind of self-promoting internet marketing thing that plays right into established stereotypes, and hopes that The Cool Kids will let us hang out with them.

I am a geek. I have been all my life, and I know that those guys are nothing like me and my friends. If we’re going to celebrate and embrace geek culture, we should have geeks leading the effort, not popular kids who are pretending to be geeks because it’s the easy way to get attention during the current 15 minute window.

I want to be clear: I wasn’t misled, I think that the project just changed from conception to release. I think their heart was in the right place, and I think their fundamental idea was awesome. But what I saw isn’t what I thought I was going to be part of. I thought I was going to be part of something that said, “Hey, I am a geek, I’m proud of that, and if you’re a geek you should be proud of it too!” What I saw was more like, “I am using new media to reach people. Yay!” There’s nothing wrong with that, but it doesn’t mean the people doing it are geeks, and it’s not what I thought I was contributing to.

There was a great counterpoint on Twitter just now, while I was wrapping this up. Wyldfire42 said: "Seems to me that we shouldn't be deciding who is or isn't a geek. If we start passing judgment, we just become the bullies we hated." I can't disagree with that, at all, and after reading that, I feel a little grognard-y. Who knows, maybe these celebrities who have recently shown up in our world love these things as much as we do. Maybe it's not their fault that they bring hordes of celebrity-obsessed non-geeks with them wherever they go. Maybe they're as upset about people telling them they're not "real" geeks as I am about marketers pretending that they are.

Maybe I’m overreacting, but I care deeply about my fellow geeks and there is a fundamental difference between embracing our culture and exploiting it. Please, come and be part of our culture. Read our books and play our games and watch our movies and argue with us about what is and isn’t canon. But if you try to grab our dice, and then don’t even know or care why we’re a little touchy about it … well I cast Magic Missile on you, dude.

ETA: I’ve been pretty active in the comments of this post, because I see the same misconception over and over again, largely the result of me being unclear when I wrote part of this post.

Somehow, a bunch of people have turned this into "Wil Wheaton says you have to do a, b, and c or you're not a geek, so fuck him because he's a dick."

That’s not what I meant, at all. Most people seem to get that, but there’s enough who don’t that I feel a need to respond, in case you don’t feel like digging through hundreds of comments to find my replies in there.

I never meant to say that unless you do a, b, or c, you don't "qualify" for admittance to some super secret clubhouse where I am the gatekeeper. When I said, "…couldn't tell you the difference between Slackware and Debian, a d8 and a d10, or how to use vi or emacs…" I didn't mean that unless a person knows what these things are, they don't pass some kind of test. I was making an example, picking out some things that I happen to be geeky about, in an attempt to illustrate a point, and I did that poorly.

I was not trying to be, and I don’t want to be, some kind of exclusionary geek elitist. That’s just the most incredibly stupid and offensive thing in the world.

As I said in a comment somewhere in this post: Creating a world where my kids don't have to grow up being picked on for loving RPGs is awesome. But what I see – not just here, but in general at this moment – is a bunch of marketing jerks trying to take the things we love and turn them into something from Hot Topic. I didn't mean "you're not geeky enough…" at all, and I hate that people seem to latch on to that, because it means I wasn't clear enough. If these guys I mentioned truly love what we are, and they have been here all along (and I've just missed them for my whole life) then it's great that they're not ashamed to love the things we love … but I haven't seen anything to indicate that they genuinely are interested in the things we love as much as they are riding a pop-culture wave that's driven by Twitter's explosive and pervasive popularity. It feels calculated and planned out by PR and marketing people, and as someone who loves this culture, that bothers me. I didn't mean to imply that you have to meet this list of criteria to come be part of our club (vi, d10, etc.) as much as I was attempting to illustrate a point: we know what at least some of those things are, and Cool Kids have teased us for it our whole lives. It feels to me like those same people are now trying to take our culture away from us and make a quick buck off of exploiting it, and us. It was not my intention to create some sort of Geek Literacy Test. That's lame. Like I said, all are welcome, but at least make an effort to understand why we care about these things.

Finally, I’ve been trading e-mails with Shira Lazar, who had this idea in the first place. She says:

Well, I think the hornet’s nest was stirred up a bit. But that’s ok. I rather open, honest discourse than people to feel shut off or alienated. That would be ridiculous and horrible.

Anyway- from reading the post and comments it’s important off the bat for people to know this isn’t a marketing ploy or some evil plan to take over the world. ha

also, It sucks that the d&d line got misconstrued. It’s important to point out that a lot of ppl besides you in the video actually do play the game- the line was more to say yes lots of geeks play d&d but you don’t need to play d&d to be a geek.

It really started as a fun way to bring people together, geeks of all extremes. To break down stereotypes. I consider myself a geek. Yes, the level of geekiness changes depending on the context. Amidst developers and my gamer friends, I might not know a lot but with some of my friends I’m queen geek. While I might not know certain things in certain situations, I still have a yearning and passion to know and learn and a love of accepting those geeks who do know it all. I was the editor of my high school newspaper and the first person to make it digital. I would hang out in my computer room at school until midnight working on photoshop and quark while my friends were out and about doing their thing. I participated in my high school science fairs and went to regionals twice. My mom is also a coordinator for children with special needs- i’ve seen kids that are alienated from their peers who need to know it’s ok and they have a place.

While some of us have struggled and some have not, some know more, some don’t- this was simply a video that was supposed to be a fun way to bring everyone together.