Tuesday, June 30, 2015



Communism Sucks


The French Marxist anthropologist/sociologist Pierre Bourdieu theorized that economic capital was not the only source of power in a society. He defined capital in many forms. Symbolic capital is the power that symbols hold for us, cultural capital includes the power of knowledge, political capital the power to change people’s minds, and so on. However, at the root of society is economic power. Without it, nothing else is possible. So money is, for the purposes of this argument, power.

It then follows that if the government controls all the means of getting money or buying things, then it has all the power. It doesn’t matter what the law says, because law is just a means of managing power in a society, and it only works when those who hold power have an incentive to restrain it; that is, checks and balances. In a society where the government has all the power, it has no incentive to check its own power, so the only ways citizens can win change in their favor are benevolent rulers or outright revolution.

Capitalism allows power to be distributed and allows those legal checks and balances to work, and it gives regular citizens a way to gain power by converting economic capital into cultural, symbolic, political, or other forms of capital, which in turn can earn more economic capital, and so on. Laws and constitutions only work so long as the citizenry has both the means and the will to ensure that power stays distributed and that those with great power are kept in check, just as the Constitution was intended to do.

Now I'd like to explain why communism sucks. What follows are ten reasons I found on the internet for why communism sucks. I've made a few slight changes here and there because I'm a stickler for correct grammar and concise language, but it's all good stuff. Please read on.

10 - Communism doesn’t value creativity. The average person, as George Carlin once observed, is not particularly good at anything. The perfect job for such a person is on the assembly line. But regardless of the governments under which we live, we all have different aspirations. Some people are perfectly happy sweeping floors, but most of us—justly—want more out of life. Not only money, but fame, glory, and a sense of accomplishment. All of these require at least some creative thought.

You may want to be a poet or a painter, but these jobs certainly don’t pay the bills—and Communism views them as unnecessary and ridiculous. All that matters is building a super-powerful nation—and one of the first obstacles that must be removed is what Jefferson called “the pursuit of happiness.”

9 - Forced Collectivization. The most notorious example of forced collectivization is the land reform carried out by Soviets between 1928 and 1933. It was thought that collectivization would maximize the use and potential of the countryside for urban and industrial needs. Russian industry was just taking off, and enormous quantities of food would be required for the workers.

Masses of resisting landowners—many of them small-scale farmers who worked their own land—died at the hands of executioners. The state’s requisition of crops, livestock, and farmland was paid for by the farmers and by the lower class in general, some ten million of whom starved to death in five years.

Exactly the same atrocity took place in Communist China, between 1958 and 1961. During this time, private farming was outlawed as it had been in Stalin’s Russia, and about 33 million people starved to death in possibly the single most destructive famine in human history.

8 - A citizen has no rights under communism. Several of these entries are related, and the absence of citizens’ rights is at the heart of more than one. In keeping with the last entry, Marx advocated ten rules in his Communist Manifesto for the forced redistribution of all land and property for the good of the national community.

This is theft, from the citizens’ point of view. They are forced to join the new Communist government—whether they like it or not. This, of course, must be done with a “might is right” frame of mind: lots of men with guns show up and take everything you have “for the glory of the motherland,” as the Soviets might have said.

7 - Reduced incentives to work hard. Incentives—such as higher pay for doctors—are necessary to give people the energy they need to work hard in a difficult job.

When there are no extra incentives available—such as in a Communist state, where all reap an equal share in what some have worked harder to sow—the people in difficult jobs quickly lose their motivation. For example, workers stop caring about how thoroughly they inspect the cars on the assembly lines, since it makes no difference to them either way. They are also likely to grow bitter at a government that gives them no recognition when they do a good job. Revolts become a distinct possibility; many a Communist state has fallen because of this problem of reduced incentives.

6 - Militant opposition to capitalism. It doesn’t take much to bring the fury of a Communist state upon you; in fact, it takes nothing more than simply existing in a capitalist state. The Communist Manifesto advocates the replacement of all governments by Communist governments. This has almost always been put into effect internally: the Russian monarchy was overthrown, as were the Republic of China and the Cuban government. But the threat is not merely internal. The US need not fear a Cuban invasion, but China is indeed a force to be reckoned with. It holds the second-largest portion of American debt, and though that only amounts to about eight percent, the number is rising. Should China ever call in America’s entire debt at once, America’s already depressed economy would be greatly harmed.

5 - Indifference towards the environment. With all the alarmist global warming nonsense, I'll bet you're surprised to hear this one. Well, the truth is that a Communist state will make up for its inefficient economy by doing whatever is necessary to produce crops and water. In the 1960s, the Soviet Communist regime diverted two important rivers for irrigation. The Aral Sea, which those rivers fed, has now shrunk to as little as ten percent of its original size. It used to be the fourth largest lake in the world.

The lesson: rather than letting the efficiency of capitalism into its economic model, the Soviet government chose to extract everything it could from the environment—without caring one bit about the health of that environment.

4 - The economic calculation problem. The relative success of the free market economy is a real-world refutation of Marxist economics. A centrally planned economy never has sufficient information about the market prices of commodities, and therefore cannot properly allocate a nation’s resources.

The only reasonable criticism of the free market economy is the presence of monopolies, which can raise the prices of their products with little fear of reprisal. But monopolies are simply the same kind of central control that a Communist government exercises over its whole economy; a true free market ensures that there are checks and balances on the price of goods and services.

3 - The class struggle goes nowhere. Marx founded Communist philosophy on the principle that class struggles have been, by far, the primary cause of all strife, wars, economic woes, and regime collapses. There are popularly thought to be three major classes of people: the upper, the middle, and the lower. The upper class has most of the wealth; the lower class the least; and the middle class plays the peacemaker between them, maintaining the hope and sanity of the lower class. Without the middle class, heads are chopped.

Communism itself does not erase the class struggle, as it proclaims, but keeps it going. It does this because it is a government: there must be a group of people in charge, and it’s likely that this group enjoys its power. By maintaining their power, the leaders of a Communist state separate the population into at least two classes: themselves as the upper class, and everyone else in the lower class.

Communist states have generally not featured a middle class—and its absence allowed for the Russian Revolutions of 1905 and 1917; the Chinese of 1949, the Cuban of 1953-59, and a host of others. All of these revolutions ended with the rise of a Communist state—and all of them were the ruin of their respective nations, because the Communists themselves became the very same brand of elitist upper class they had deposed.

2 - State-sponsored mass murder. Communist rule may be directly blamed for the deaths of at least eighty-five million people in the twentieth century. Stalin alone murdered about twenty million, although other estimates range from fifty-three million to eighty million.

In 1975, the Khmer Rouge seized power in Cambodia, and set out to establish a Communist utopia. They immediately committed genocide on their own people. At least two million were executed by brutally primitive methods in keeping with the Khmer Rouge’s anti-technology stance; many of the victims were murdered merely for wearing glasses. Intelligence was deemed a direct and serious threat to the Khmer Rouge.

And let’s not forget Chairman Mao. He may not have been as evil as Stalin, but he was the very definition of indifference towards humanity. His “Great Leap Forward” caused the deaths by starvation of forty-five million Chinese civilians.

1 - Lastly, Karl Marx was wrong to begin with; Marx’s doctrine is fraught with faulty logic, loopholes, and unsolved problems. His idea of economics is based on the labor theory of value, which asserts that a car, for example, should cost more than a TV, because more labor is needed to produce it. But this is an oversimplification of the market.

Sam’s Choice Cola tastes almost identical to Coca-Cola, but costs half as much. The labor is the same, but people are happy to pay twice as much for the only difference: the brand name. The same holds true with medicine.

In the same way, tennis shoes can cost over $200 in the US, despite being made in China or Taiwan for only about $3–10. Why do they cost so much? Because the companies that sell them price them according to how much the public demands them. That’s why they have athletes endorse their products: to make them more desirable to the athletes’ fans.

This is expressly why Marxist communism has caused the utter collapse of so many national economies: it thinks in broad strokes, and fails to tell one subtlety from another. This, first and foremost, is because communism is not grounded in reality.

Ambrose Bierce


Ambrose Gwinnett Bierce (born June 24, 1842; assumed to have died sometime after December 26, 1913) was an American editorialist, journalist, short story writer, fabulist, and satirist. He wrote the short story "An Occurrence at Owl Creek Bridge" and compiled a satirical lexicon entitled The Devil's Dictionary. His vehemence as a critic, his motto "Nothing matters", and the sardonic view of human nature that informed his work, all earned him the nickname "Bitter Bierce".

Despite his reputation as a searing critic, Bierce was known to encourage younger writers, including poet George Sterling and fiction writer W. C. Morrow. Bierce employed a distinctive style of writing, especially in his stories. His style often embraces an abrupt beginning, dark imagery, vague references to time, limited descriptions, impossible events and the theme of war.

In 1913, Bierce traveled to Mexico to gain first-hand experience of the Mexican Revolution. While traveling with rebel troops, he disappeared without a trace.

Bierce was born at Horse Cave Creek in Meigs County, Ohio, to Marcus Aurelius Bierce (1799–1876) and Laura Sherwood Bierce. His mother was a descendant of William Bradford. His parents were a poor but literary couple who instilled in him a deep love for books and writing. The boy grew up in Kosciusko County, Indiana, attending high school at the county seat, Warsaw.

He was the tenth of thirteen children whose father gave all of them names beginning with the letter "A". In order of birth, the Bierce siblings were Abigail, Amelia, Ann, Addison, Aurelius, Augustus, Almeda, Andrew, Albert, Ambrose, Arthur, Adelia, and Aurelia. He left home at age fifteen to become a "printer's devil" at a small Ohio newspaper.

At the outset of the American Civil War, Bierce enlisted in the Union Army's 9th Indiana Infantry Regiment. He participated in the Operations in Western Virginia campaign (1861), was present at the "first battle" at Philippi and received newspaper attention for his daring rescue, under fire, of a gravely wounded comrade at the Battle of Rich Mountain. In February 1862 he was commissioned First Lieutenant, and served on the staff of General William Babcock Hazen as a topographical engineer, making maps of likely battlefields.

Bierce fought at the Battle of Shiloh (April 1862), a terrifying experience that became a source for several later short stories and the memoir, "What I Saw of Shiloh". In June 1864, he sustained a serious head wound at the Battle of Kennesaw Mountain, and spent the rest of the summer on furlough, returning to active duty in September. He was discharged from the army in January 1865.

His military career resumed, however, when in mid-1866 he rejoined General Hazen as part of the latter's expedition to inspect military outposts across the Great Plains. The expedition proceeded by horseback and wagon from Omaha, Nebraska, arriving toward year's end in San Francisco, California.

Bierce married Mary Ellen "Mollie" Day on 25 December 1871. They had three children; two sons, Day (1872–1889) and Leigh (1874–1901), and a daughter, Helen (1875–1940). Both of Bierce's sons died before he did: Day committed suicide due to depression over a romantic rejection, and Leigh died of pneumonia related to alcoholism. Bierce separated from his wife in 1888 after discovering compromising letters to her from an admirer. They divorced in 1904 and Mollie Day Bierce died the following year.

As for his religious views, Bierce was an agnostic.

Bierce suffered from lifelong asthma as well as complications from his war wounds.

In San Francisco, Bierce received the rank of brevet major before resigning from the Army. He remained in San Francisco for many years, eventually becoming famous as a contributor and/or editor for a number of local newspapers and periodicals, including The San Francisco News Letter, The Argonaut, the Overland Monthly, The Californian and The Wasp. A selection of his crime reporting from The San Francisco News Letter was included in The Library of America anthology True Crime.

Deadwood in 1876.
Bierce lived and wrote in England from 1872 to 1875, contributing to Fun magazine. His first book, The Fiend's Delight, a compilation of his articles, was published in London in 1873 by John Camden Hotten under the pseudonym "Dod Grile". Returning to the United States, he again took up residence in San Francisco. From 1879 to 1880, he traveled to Rockerville and Deadwood in the Dakota Territory, to try his hand as local manager for a New York mining company; when the company failed he returned to San Francisco and resumed his career in journalism.

In 1887, he published a column called "Prattle" and became one of the first regular columnists and editorialists to be employed on William Randolph Hearst's newspaper, the San Francisco Examiner, eventually becoming one of the most prominent and influential among the writers and journalists of the West Coast. He remained associated with Hearst Newspapers until 1906.

Bierce was considered a master of pure English by his contemporaries, and virtually everything that came from his pen was notable for its judicious wording and economy of style. He wrote in a variety of literary genres.

His short stories are held among the best of the 19th century, providing a popular following based on his roots. He wrote realistically of the terrible things he had seen in the war in such stories as "An Occurrence at Owl Creek Bridge", "The Boarded Window", "Killed at Resaca", and "Chickamauga".

In addition to his ghost and war stories, he also published several volumes of poetry. His Fantastic Fables anticipated the ironic style of grotesquerie that became a more common genre in the 20th century.

One of Bierce's most famous works is his much-quoted book, The Devil's Dictionary, originally an occasional newspaper item, which was first published in book form in 1906 as The Cynic's Word Book. It consists of satirical definitions of English words which lampoon cant and political double-talk.

Under the entry "leonine", meaning a single line of poetry with an internal rhyming scheme, he included an apocryphal couplet written by the fictitious "Bella Peeler Silcox" (i.e. Ella Wheeler Wilcox) in which an internal rhyme is achieved in both lines only by mispronouncing the rhyming words:

The electric light invades the dunnest deep of Hades.
Cries Pluto, 'twixt his snores: "O tempora! O mores!"

Bierce's twelve-volume Collected Works were published in 1909, the seventh volume of which consists solely of The Devil's Dictionary, the title Bierce himself preferred to The Cynic's Word Book.

In October 1913 Bierce, then aged 71, departed Washington, D.C., for a tour of his old Civil War battlefields. By December he had passed through Louisiana and Texas, crossing by way of El Paso into Mexico, which was in the throes of revolution. In Ciudad Juárez he joined Pancho Villa's army as an observer, and in that role he witnessed the Battle of Tierra Blanca.

Pancho Villa
Bierce is known to have accompanied Villa's army as far as the city of Chihuahua. His last known communication with the world was a letter he wrote there to Blanche Partington, a close friend, dated December 26, 1913. After closing this letter by saying, "As to me, I leave here tomorrow for an unknown destination," he vanished without a trace, becoming one of the most famous disappearances in American literary history. Skeptic Joe Nickell argued that such a letter had never been found. All that existed was a notebook belonging to his secretary and companion, Carrie Christiansen, containing a rough summary of a purported letter and her statement that the originals had been destroyed.

Oral tradition in Sierra Mojada, Coahuila, documented by the priest James Lienert, states that Bierce was executed by firing squad in the town cemetery there. Again, Nickell finds this story to be rather incredible. He quotes Bierce's friend and biographer Walter Neale as saying that in 1913, Bierce had not ridden for quite some time, was suffering from serious asthma, and had been severely critical of Pancho Villa. Neale concludes that it would have been highly unlikely for Bierce to have gone to Mexico and joined up with Villa.

All investigations into his fate have proven fruitless, and Nickell concedes that despite a lack of hard evidence that Bierce had gone to Mexico, there is also none that he had not. Therefore, despite an abundance of theories (including death by suicide), his end remains shrouded in mystery.

Ambrose Bierce was influenced by such writers as Jonathan Swift, Voltaire, Edgar Allan Poe, and Mark Twain. Bierce in turn influenced writers such as H.L. Mencken, William March, Jorge Luis Borges, Julio Cortázar, Stephen Crane, and Ernest Hemingway.

Monday, June 29, 2015

Climb El Capitan


Climb El Capitan in Yosemite National Park without expending any calories at this cool interactive Google Maps site.

Camping 1700 feet up the face of El Capitan.

Too Many People


Top 10 Most Populous Countries

1. China: 1,361,512,535
2. India: 1,251,695,584
3. United States: 321,362,789
4. Indonesia: 255,993,674
5. Brazil: 204,259,812
6. Pakistan: 199,085,847
7. Nigeria: 181,562,056
8. Bangladesh: 168,957,745
9. Russia: 142,423,773
10. Japan: 126,919,659

Last Saturday evening, while telling me that the new SCOTUS ruling for same-sex marriage was a good thing, my neighbor asserted that there are 3 or 4 billion people living in the USA. I attempted to correct him, but he was pretty sure he was right.

Apophenia


Apophenia is the experience of seeing meaningful patterns or connections in random or meaningless data.

The term is a misnomer: Peter Brugger attributed it to Klaus Conrad, who had defined the underlying phenomenon as the "unmotivated seeing of connections" accompanied by a "specific experience of an abnormal meaningfulness". It has since come to represent the general human tendency to seek patterns in random information, as in gambling, paranormal phenomena, and religion.

In 1958, Klaus Conrad published a monograph entitled Die beginnende Schizophrenie. Versuch einer Gestaltanalyse des Wahns ("The Onset of Schizophrenia: An Attempt at a Gestalt Analysis of Delusion", not yet translated into or published in English), in which he described in groundbreaking detail the prodromal mood and earliest stages of schizophrenia. He coined the word "Apophänie" to characterize the onset of delusional thinking in psychosis. This neologism is translated as "apophany", from the Greek apo [away from] + phaenein [to show], to reflect the fact that the schizophrenic initially experiences delusion as revelation. In contrast to epiphany, however, apophany does not provide insight into the true nature of reality or its interconnectedness, but is a "process of repetitively and monotonously experiencing abnormal meanings in the entire surrounding experiential field" which are entirely self-referential, solipsistic and paranoid: "being observed, spoken about, the object of eavesdropping, followed by strangers". In short, "apophenia" is a misnomer that has taken on a bastardized meaning never intended by Conrad when he coined the neologism "apophany."

Faces on Mars.
In 2008, Michael Shermer coined the word 'patternicity', defining it as "the tendency to find meaningful patterns in meaningless noise". In The Believing Brain (2011), Shermer defines patternicity as "the tendency to find meaningful patterns in both meaningful and meaningless noise". The Believing Brain thesis also says that we have "the tendency to infuse patterns with meaning, intention, and agency", which Shermer calls 'agenticity'.

In statistics, apophenia is known as a Type I error — the identification of false patterns in data. It may be compared with a so-called false positive in other test situations.
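
To see what a Type I error looks like in practice, here is a minimal sketch in Python (the sample sizes and threshold are illustrative assumptions, not from any real dataset): it runs a standard significance test on thousands of batches of pure random noise, and roughly 5 percent of them come out "significant" at the usual 0.05 threshold even though there is no pattern to find.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

alpha = 0.05           # conventional significance threshold
n_experiments = 10000  # batches of pure noise
false_alarms = 0

for _ in range(n_experiments):
    sample = rng.normal(loc=0.0, scale=1.0, size=30)   # noise with true mean 0
    _, p_value = stats.ttest_1samp(sample, popmean=0.0)
    if p_value < alpha:        # a "pattern" detected where none exists
        false_alarms += 1

print(f"False positives: {false_alarms} of {n_experiments} "
      f"({false_alarms / n_experiments:.1%}, expected about {alpha:.0%})")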

Attempts to foretell the future, present, or past by finding patterns in animal entrails or tossed sticks, or by picking random passages from a holy text, are further examples of apophenia.

A more extreme example is the pareidolia associated with finding the faces of religious figures in pieces of toast, the grain of cut wood, or other such patterns. 21st century real-world examples include the finding of a cross inside a halved potato; the appearance of Jesus and Mary inside a halved orange; and the appearance of Jesus' face on a piece of toast, in the frost on a car window, and inside the lid of a jar of Marmite.

Apophenia is well documented as a driver of gambling behavior, with gamblers imagining they see patterns in the occurrence of numbers in lotteries, on roulette wheels, and even in cards. One variation of this is known as the Gambler's Fallacy.
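
The fallacy is easy to demonstrate with a quick simulation (a Python sketch assuming a fair two-color wheel, which is a simplification of real roulette): after a streak of five reds, the next spin is still just as likely to come up red as any other spin.

import numpy as np

rng = np.random.default_rng(7)

spins = rng.integers(0, 2, size=1_000_000)   # 0 = black, 1 = red; fair wheel, no green zero
streak = 5                                   # look for runs of five reds in a row

next_after_streak = []
run = 0
for i, s in enumerate(spins[:-1]):
    run = run + 1 if s == 1 else 0
    if run >= streak:
        next_after_streak.append(spins[i + 1])

print(f"P(red) overall:              {spins.mean():.3f}")
print(f"P(red | five reds just hit): {np.mean(next_after_streak):.3f}")
# Both numbers hover around 0.5 -- the streak tells you nothing about the next spin.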

Carl Jung coined the term synchronicity for the "simultaneous occurrence of two meaningful but not causally connected events" creating a significant realm of philosophical exploration. This attempt at finding patterns within a world where coincidence does not exist possibly involves apophenia if a person's perspective attributes their own causation to a series of events. "Synchronicity therefore means the simultaneous occurrence of a certain psychic state with one or more external events which appear as meaningful parallels to a momentary subjective state". (C. Jung, 1960)

Pareidolia is a type of apophenia involving the perception of images or sounds in random stimuli, for example, hearing a ringing phone while taking a shower. The noise produced by the running water gives a background from which the brain perceives the patterned sound of a ringing phone. A more common human experience is perceiving faces in inanimate objects; this phenomenon is not surprising in light of how much processing the brain does in order to memorize and recall the faces of hundreds or thousands of different individuals. In one respect, the brain is a facial recognition, storage, and recall machine - and it is very good at it. A byproduct of this acumen at recognizing faces is that people see faces even where there is no face: the headlights and grille of an automobile can appear to be "grinning", people around the world can see the "Man in the Moon", and even children will identify a drawing consisting of only three circles and a line as a face.

In fiction, Postmodern novelists and film-makers have reflected on apophenia-related phenomena, such as:

Use of apophenia in text/plot in Peter Watts's Blindsight and paranoid narration or fuzzy plotting in Vladimir Nabokov's "Signs and Symbols"; Thomas Pynchon's The Crying of Lot 49 and V.; Alan Moore's Watchmen; From Hell (specifically Appendix II of From Hell titled "Dance of the Gull Catchers"); Umberto Eco's The Name of the Rose and Foucault's Pendulum; William Gibson's Pattern Recognition.

As narrative is one of humanity's major cognitive instruments for structuring reality, there is some common ground between apophenia and narrative fallacies such as hindsight bias. Since pattern recognition may be related to plans, goals, and ideology, and may be a matter of group ideology rather than a matter of solitary delusion, the interpreter attempting to diagnose or identify apophenia may have to face a conflict of interpretations.

Lastly, a word about the Dark Side of the Rainbow – also known as Dark Side of Oz or The Wizard of Floyd – which refers to the pairing of the 1973 Pink Floyd album The Dark Side of the Moon with the visual portion of the 1939 film The Wizard of Oz. This pairing produces moments where the film and the album appear to correspond with each other. Band members and others involved in the making of the album state that any relationship between the two works of art is merely a coincidence. Here is a link to the YouTube video.

Friday, June 26, 2015

Mad Max: Fury Road


Last week I downloaded the new Mad Max movie (Fury Road) and, after the day's chores were accomplished, I skipped dinner and made up a batch of popcorn. Prior to settling in, I journeyed out and purchased a Dr. Pepper as well, because a movie, popcorn, and soda are more than a habit with me. They're an institution.

Okay, all prepped, I put up my feet and started the show. I had read a few reviews already that had prepared me for some sort of feminist diatribe on the more vulgar aspects of the male population as it related to a post-apocalyptic vision. Additionally, I was already intimately familiar with the Mad Max mythology: the lone road warrior in a souped-up hot rod surviving on the highways in a world gone mad. As I discovered, being aware of what transpired before in the old Mel Gibson universe was not necessary at all.

With that said, I'm not even going to try to tell you what Fury Road is "about." All you need to know is that it's a Mad Max movie.

Tom Hardy plays the new and updated Mad Max. He's still the strong silent type, but now perhaps closer to the full-blown, gone-crazy-with-loneliness type. There's even a little speech one of the characters gives to Max that says pretty much the same thing. In any case, Hardy does an okay job with the role.

The real scene-stealer is Charlize Theron. She portrays a kick-ass, one-armed female warrior from the clan at the heart of the crazy story. Charlize is the true hero of this simplistic sci-fi fantasy, although she couldn't have pulled off saving everybody if it had not been for Max. The truth is, the storyline is weak, but then so are the storylines in the other Mad Max movies. Face it, we watch such drivel for the car chases, the violence, the insanity of an end-of-the-world scenario, and for the soda and popcorn experience. All that taken into consideration, there are really only two reasons to watch: 1) Charlize Theron, and 2) the incredible cinematography.

I know, I know, it all takes place in a desert, but my goodness, what director George Miller has done with the desert. Watch this one in high definition on a big screen. Two thumbs up. Embedded feminism? Meh, who cares, it was fun.

Procrastination



I struggle with procrastination every day. Mainly, it's a diversion from my daily writing chore, which is directed at completing my next novel. I am pretty far along but still I thought I'd be done by now. I would be if I didn't procrastinate so much.

I generally write an essay per day on the side or blog entries in various internet venues, so it's not like I don't write at all. It's just that these large projects get to be terribly fatiguing.

The current piece is probably the longest (pagewise) of my less than celebrated career (600+ pages). I'm on the downhill run with 12 or 13 chapters to go. It will be finished... and soon. Yet, I know I will wrestle with the manuscript some more before I can call it complete.

Procrastination is a mind game we play with ourselves. We make up rewards and dole them out at milestones and chapter endings like a carrot on a stick. It could be that in my case, procrastination is a built-in mechanism to make me think more carefully about what I'm doing, to slow me down, to cause me to reflect on the arc of the story.

It could also be that anything I say is a rationalization about why I haven't finished my latest book. In all truth, reality does bear down from time to time; the yard needs to be mowed, cats must have their shots, dinner must be cooked, dishes washed, teeth brushed.

I can see where writing this little bit has been sort of a pep talk to myself, like the little girl in the video mimicking Shia LaBeouf. Just do it, she says. Make your dreams come true. Okay, little girl, I think I will.

My new novel is science fiction and I think it will be entitled... (not yet copyrighted, so I'll keep it to myself). Anyway, it's about transdimensional communication, far away space places, and time travel, but it's also a bit of a romance.

When I'm done dragging my feet (just do it), I think it's going to be an interesting, exciting read, a real rabbit hole of a story. It's coming soon and I hope you enjoy it.


Justice = Money


Why we spend billions to keep half a million unconvicted people behind bars...

June 11, 2015, Washington Post
Full article here.

At any given time, roughly 480,000 people sit in America's local jails awaiting their day in court, according to an estimate by the International Centre for Prison Studies. These are people who have been charged with a crime, but not convicted. They remain innocent in the eyes of the law. Three quarters of them... are nonviolent offenders, arrested for traffic violations, or property crimes, or simple drug possession. Many will be found innocent and have their charges dropped completely. Defendants who [are] detained before trial [wait] a median of 68 days in jail. Many... are forced to wait simply because they can't afford to post bail. A 2013 analysis by the Drug Policy Alliance... found that nearly 40 percent of New Jersey's jail population fell into this category. People sit behind bars not because they're dangerous, or because they're a flight risk, but simply because they can't come up with the cash. A recent analysis by the Vera Institute... found that 41 percent of New York City's inmates were sitting in jail on a misdemeanor charge because they couldn't meet a bail of $2,500 or less. For low income people, the consequences of a pre-trial detention, even a brief one, can be disastrous. And in many cases, these people will eventually be found to be innocent. Some civil rights reformers [argue] that bail policies are tantamount to locking people up for being poor. We spend somewhere in the ballpark of $17 billion dollars annually to keep innocent people locked up as they await trial.

Thursday, June 25, 2015


"When love is not madness it is not love."
- Pedro Calderon de la Barca


That Little Old Band From Texas



All right, just one more...


Billy Gibbons


Billy Gibbons
I spent my wonder years growing up in the lily-white suburbs of southwest Houston. By the time I made it to junior high school, pop music had taken over the radio airwaves and had begun a steady beat that eventually became the unifying basis for massive cultural changes. I was a pop music junkie, and musicians became my idols, especially rock and roll guitar players. The wilder, the louder, the better. Back then, as I struggled to learn guitar chords, I became aware that there were other kids around me who were similarly infected. Billy Gibbons was one of them, and he actually went on to become a guitar legend and one of the best players in the world. If it weren't for all the tone-deaf yankees out there, Billy would be considered a national treasure. As it is, we'll just have to claim him as one of the state's natural resources.

Class of 1968.
Billy was born to Frederick Royal (Freddie) and Lorraine Gibbons in the Tanglewood suburb of Houston, Texas. His father was an entertainer, orchestra conductor, and concert pianist who worked alongside his second cousin, art director Cedric Gibbons, for Samuel Goldwyn at MGM Studios. In 1963, following his thirteenth birthday, Gibbons received his first electric guitar, a sunburst Gibson Melody Maker, accompanied by a Large Cat amplifier, and was influenced by guitarists such as Jimmy Reed. While attending Warner Brothers' art school in Hollywood, California, Gibbons played in his first bands, including The Saints, Billy G & the Blueflames, and The Coachmen. By 18, Gibbons had formed a band conceptually inspired by his friend and fellow musician Roky Erickson and The 13th Floor Elevators, naming the group the "Moving Sidewalks", penning the hit single "99th Floor", and striking up a friendship with Jimi Hendrix. Hendrix went on to say on The Tonight Show and The Dick Cavett Show that Gibbons would be the next hottest guitarist.


Gibbons founded the Texas psychedelic group The Moving Sidewalks, which recorded several singles and one full-length album, Flash. Gibbons and The Moving Sidewalks came to prominence opening for The Jimi Hendrix Experience during Hendrix's first American tour as a headliner. Also notable was the Gibbons-penned song "99th Floor", its title a nod to the influence on Gibbons of fellow Texans and pioneering psychedelic band The 13th Floor Elevators. He has also commented during live performances, while playing the string-bending intro to "Foxy Lady", that Hendrix taught him how to play the song when Gibbons was "about 17" in Dallas. Longstanding rumors have it that at the end of the tour Hendrix gave Gibbons the pink Stratocaster he had been playing as a token of his appreciation for Gibbons' talent, and that Hendrix subsequently stated that Gibbons was one of the best guitarists in the US.

The Moving Sidewalks w/Go-Go dancer.
Gibbons formed ZZ Top in late 1969, and the lineup quickly settled on bassist/vocalist Dusty Hill and drummer Frank "Rube" Beard, both members of the band American Blues. After honing their trademark blues-rock style, they released the aptly titled ZZ Top's First Album on London Records in 1971.

The band rolled on, intensively touring and recording/releasing albums until 1977, when they took an extended hiatus. Their long-time manager took this time to negotiate a deal that allowed the band to keep control of their previous recordings, to be distributed by their new label, Warner Bros. Records. They reunited two-and-a-half years later in order to start recording under a new Warner Bros. contract. Unbeknownst to each other at the time, both Dusty Hill and Billy Gibbons had grown the chest-length beards that quickly became a part of their image.

The band hit international prominence and their commercial peak with the release of 1983's diamond-selling disc Eliminator. Eliminator was named after Gibbons' customized 1933 Ford Coupe, which was featured in three of the band's music videos. This vehicle is on exhibition at the Rock and Roll Hall of Fame in Cleveland, Ohio. The album featured the hits "Gimme All Your Lovin'", "Sharp Dressed Man", and "Legs".

A wave of music videos for the hit singles "Legs", "Gimme All Your Lovin'", and "Sharp Dressed Man", among others, became mainstays on MTV.

Despite losing some of their early fans to a more radio-friendly sound and blunders such as the remixed compilation Six Pack (1987), the band's unique blend of boogie and humorous, sometimes raunchy, lyrics, supported by Gibbons' blues-based prowess, continues to attract fans. In recent years, Gibbons has made appearances with other artists and acted on television shows, most notably Bones. He was ranked at number 32 on the 2011 Rolling Stone list of the 100 Greatest Guitarists of All Time. ZZ Top's album La Futura was released in September 2012.

Dusty Hill, Billy Gibbons, & Frank Beard
In 2004, ZZ Top was inducted into the Rock and Roll Hall of Fame. They have the distinction of being among a very small group of bands with a 40-year-plus history that still has all of its original members.

In December 2005, Gibbons married long-time girlfriend Gilligan Stillwater.

Gibbons is an avid car collector and custom car enthusiast with an extensive collection that includes a 1948 Cadillac Series 62 (known as CadZZilla), a 1962 Chevrolet Impala (known as Slampala), a 1950 Ford Business Coupe, and a 1958 Ford Thunderbird. Gibbons also published a book in 2011 about his love of cars and guitars titled Billy F Gibbons: Rock + Roll Gearhead. The November 2014 issue of Guitar World magazine features an interview with Gibbons and fellow guitarist Jeff Beck about their mutual appreciation of "cars, guitars, and everything in between".


Thanks, Billy.


Tuesday, June 23, 2015

Simulations Indistinguishable From Reality


You've probably heard of the concept of the technological singularity. It is a hypothetical moment in the near future when artificial intelligence becomes indistinguishable from human intelligence—and capable of reproducing itself. When you apply the same general idea to simulations, you get the "simulation singularity": that's when a simulated world is indistinguishable from reality.

Engineer Andy Fawkes, who works for global simulation software company Bohemia Interactive Simulations (BIS) and is director of tech and training company Thinke, asks "Will there be a world where the simulation may be just as good as the real world?"

"In a sense, I think in some regards it’s already happening," Fawkes added. If people’s minds are already accepting a simulated world as “real” somehow, then we could perhaps consider that we’ve already reached the tipping point.

Another example that shows the power of realistic simulators is in the military sphere: pilots learn to fly using simulators that effectively trick the brain into thinking it’s actually controlling a plane. In the US, a quadriplegic woman was able to "fly" an F-35 fighter jet using nothing but her mind. Fawkes looks forward to strapping on an Oculus Rift in old age and “escaping” his weary body.

The truth is, reality is not just about the visuals; it’s a whole complex system that, so far, is impossible to model with any great accuracy because of the vast number of uncontrolled and misunderstood variables.

Take the weather, for instance, which we’re still not great at modelling even over a few days in advance, never mind 100 years. In this sense, the chaos of reality still beats our best simulators by a long shot.

It may never be possible to predict the actions of chaotic systems, but even improvements that are minor in scope compared to the idea of a total singularity could have a huge effect, like being able to predict the weather for a month, or better model asteroid impacts.

So maybe we’re not that close to a true simulation singularity after all. However, as far as simply confusing the human senses about what’s real and what’s not, well, we're already there. You don’t need perfect graphics to induce suspension of disbelief in the human brain anyway: Just think about how your mind can get carried away watching a film or reading a book. On some level you know it’s not real, but that doesn’t stop you from being emotionally invested. People get married in Second Life.

The human brain is one of the best simulators we’ve got. Things take an inevitable philosophical turn at this juncture: if simulations are so realistic (or we’re so gullible), then how do we know we’re not already in a simulation? It’s a question that’s been pondered by everyone from Plato with his cave allegory to Matrix fanboys to philosopher Nick Bostrom with his simulation hypothesis.

There’s a link to be drawn here with the broader technological singularity—because if we are living in a simulation, who or what is behind it? Even as far as we know, a lot of digital simulations rely on some sort of AI to inform what they show.

On the other hand, artificially intelligent robots often use simulations to train for the real world, which might be handy when it comes to warding off potential negative consequences along the way to improved AI. "If you’re going to have artificial intelligence in the real world, maybe it’s best to test it in a simulation first," Fawkes suggested.

The trouble with simulation training is that a robot would have difficulty distinguishing between reality and a simulation, similar to humans having difficulty distinguishing between what is real and what is not when simulation training is done in real world scenarios. After all, when accepted authorities use stagecraft to create real world simulations, the illusion becomes overwhelmingly strong. That, of course, is what propaganda is all about.

Such scenarios are frequently foisted upon the public with great success in order to influence, control, and guide behavior. These are simulations in their own right, but they also commonly use wondrous CGI or human actors to fully convince. If you didn't know already, on 12/29/12 President Obama signed HR 4310, the 2013 National Defense Authorization Act. Section 1078 (thomas.loc.gov/cgi-bin/query/z?c112:H.R.4310:) authorizes the use of propaganda inside the US, which had previously been banned since the Smith-Mundt Act was passed in 1948.

You won't hear the truth about these real world simulations on the evening news. They are, by their very nature, secretive operations. After all, when you want to sell someone something they don't want, you don't tell them the whole deal is a scam.

Whether you want to call such things simulations or illusions, the intent is the same -- a distortion of reality (yes, I assure you there is still an absolute reality out there) intended to ensnare the hearts and minds of people for the purposes of control. It's a kind of enslavement and when presented flawlessly, guarantees that none of us will live free. You see, without truth, there can be no freedom. What we currently have is the illusion of freedom. If enough people ever caught on, most likely those behind the illusion would have hell to pay.

Monday, June 22, 2015

Favorite Movie Scenes #3



You Mad, Bro?


Confederate Memorial Day 2016
Tuesday, January 19, 2016 (observed in Texas)

Avogadro's Number


Amedeo Avogadro
Avogadro's number, also known as Avogadro's constant, is defined as the quantity of atoms in precisely 12 grams of carbon-12. The designation is a recognition of Amedeo Avogadro, who was the first to state that the volume of a gas (at a given temperature and pressure) is proportional to the number of particles it contains. It's really self-evident when you think about it. This number is given as 6.02214179 x 10^23 mol^-1.
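
As a sanity check on that definition, here is a tiny back-of-the-envelope calculation (a Python sketch using the value quoted above; the 1 g example is just an illustration): 12 grams of carbon-12 works out to exactly one mole, i.e. one Avogadro's number of atoms, and the same arithmetic gives the count for any other mass.

AVOGADRO = 6.02214179e23   # atoms per mole (value quoted above)

def atoms_in_sample(mass_g, molar_mass_g_per_mol):
    # Number of atoms (or molecules) in a sample of the given mass.
    moles = mass_g / molar_mass_g_per_mol
    return moles * AVOGADRO

# 12 g of carbon-12 (molar mass 12 g/mol) is exactly one mole of atoms:
print(f"{atoms_in_sample(12.0, 12.0):.4e} atoms")   # ~6.0221e+23

# The same arithmetic for 1 g of carbon-12:
print(f"{atoms_in_sample(1.0, 12.0):.4e} atoms")    # ~5.0185e+22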

Amedeo Avogadro lived in the early 19th century and was an Italian savant known for his role in many different scientific disciplines. His most famous statement is known as Avogadro's Law, a hypothesis which states, "Equal volumes of ideal or perfect gases, at the same temperature and pressure, contain the same number of particles, or molecules."

It is an intriguing hypothesis, because it says that quite different elements, such as nitrogen and hydrogen, still have the same number of molecules in the same volume of an ideal gas. While in real world settings this is not strictly true, it is statistically quite close, and so the ideal model still has a great deal of value.


The relationship can be expressed as p1V1 / (n1T1) = p2V2 / (n2T2) = constant, where p is the pressure of the gas, T is its absolute temperature, V is its volume, and n is the number of moles.
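
In practice you use the relation by setting the two states equal and solving for the unknown. Here is a minimal sketch in Python (the round-number starting values are assumptions for illustration, not figures from the text): one mole of an ideal gas at 0 °C and 1 atm occupies about 22.4 L, so halving the pressure at constant temperature should double the volume.

def v2_from_state_change(p1, v1, n1, t1, p2, n2, t2):
    # Solve p1*v1/(n1*t1) = p2*v2/(n2*t2) for the new volume v2.
    return (p1 * v1 * n2 * t2) / (n1 * t1 * p2)

# State 1: 1 mol at 101.325 kPa and 273.15 K occupies ~22.4 L.
p1, v1, n1, t1 = 101.325, 22.4, 1.0, 273.15

# State 2: same amount of gas, same temperature, half the pressure.
v2 = v2_from_state_change(p1, v1, n1, t1, p2=p1 / 2, n2=1.0, t2=273.15)
print(f"New volume: {v2:.1f} L")   # ~44.8 L, as expected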

Part of Avogadro's genius is that he was able to see this fundamental relationship long before the experimental evidence was available to validate it. His innate understanding of the nature of ideal gasses was astounding, and it wasn't until decades later that experimental evidence finally supported his hypothesis.

In the 1860s, more than 50 years after Avogadro first made his hypothesis, the Austrian high school teacher Josef Loschmidt calculated how many molecules were in a single cubic centimeter of a gas under typical pressure and temperature. He determined this to be approximately 2.6 x 10^19 molecules, a number now known as Loschmidt's constant, which has since been refined to 2.68677725 x 10^25 m^-3.
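
Loschmidt's number falls straight out of the ideal gas law written as N/V = p/(kT). Here is a quick check in Python (assuming standard conditions of 0 °C and 1 atm, one common convention for this constant):

K_BOLTZMANN = 1.380649e-23   # Boltzmann constant, J/K
PRESSURE = 101325.0          # Pa (1 atm)
TEMPERATURE = 273.15         # K (0 deg C)

# Number density of an ideal gas: N/V = p / (k*T)
number_density = PRESSURE / (K_BOLTZMANN * TEMPERATURE)
print(f"{number_density:.4e} molecules per cubic meter")   # ~2.687e+25
print(f"{number_density * 1e-6:.2e} molecules per cm^3")   # ~2.7e+19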

Throughout the early years of the 20th century, a search was undertaken to discover the precise value of Avogadro's number. Molecules were still largely theoretical entities to many scientists until the early part of the 20th century, and so actually determining the value through experiment was not feasible. Once it became feasible, however, it was immediately apparent that the value was important, as it reflected on the fundamental nature of ideal gasses.

The name "Avogadro's number" was first used in a paper from 1909, by the scientist Jean Baptiste Jean Perrin, who later went on to win the Nobel Prize in Physics in 1926. He stated in the paper that, "The invariable number N is a universal constant, which may be appropriately designated 'Avogadro's Constant.'"

For years leading up to the 1960s, there was some dispute as to the actual value of this number. Some groups based their calculations on oxygen-16, while others used the naturally occurring mixture of oxygen isotopes, leading to slightly different values. In 1960, the constant was redefined on the basis of carbon-12, making the number much more consistent.

Sunday, June 21, 2015



MSM Fabricates Stories


MSM fabricates stories using actors and reports them as truth in deceptive news theater

by Ethan A. Huff
(NaturalNews)

The mainstream media has once again been caught making up news stories, this time with a bogus report about a naked man escaping from a window at Buckingham Palace. Reports indicate that NBC News, apparently unaware that the pre-planned stunt was part of a scene for a drama show on its own network, picked up the story, which quickly went viral, and spread it around as truth before later being outed for reporting a lie.

Sliding down a rope at the official London residence of Britain's royal family, the mostly naked man was reportedly an actor on the E! Network show The Royals, which is owned by NBC. A crude video clip of the man, captured on a shaky mobile phone camera, was apparently uploaded to YouTube as if it were real. It quickly went viral, eventually making its way onto NBC News as a real event.

But the whole thing was a cleverly designed press campaign that, according to an E! Insider, "tricked even NBC News." Whether or not this is actually true, or if NBC News knowingly reported the fake event and later played dumb in order to help promote it, is anyone's guess. But in either case, this manufactured piece of news is just the latest example of MSM news fabrication.

Brian Williams
Yes, but he's a team player...
The Washington Post, Brian Williams, Bill O'Reilly and many others all caught making up fake news stories

The Washington Post did the same thing back in 2013, when it erroneously reported that former vice presidential candidate and Alaska governor Sarah Palin had joined the Qatari-owned news network Al Jazeera as a host and commentator. The original story had appeared on a satirical news website, which WP should have known, and yet the esteemed news outlet reported it as fact.

More recently, former NBC Nightly News anchor Brian Williams was caught lying about his involvement in a firefight during the U.S. invasion of Iraq. Williams had claimed that he was aboard a military helicopter that was hit by enemy fire and forced to land, a story which later turned out to be completely fabricated.

Fox News personality Bill O'Reilly was also caught making up similar war stories on his show The O'Reilly Factor earlier this year. Claiming that he was on location covering the civil war in El Salvador in 1980 when he supposedly witnessed four nuns get murdered, O'Reilly was later exposed as not even having gone to El Salvador as a correspondent until 1981.

We Need to Talk about Sandy Hook
The war in Iraq, Sandy Hook, Disneyland measles and more: an American legacy of manufactured news

Larger-scale news stories like the Sandy Hook shootings, the supposed Disneyland measles outbreak, weapons of mass destruction in Iraq and much more have also had their validity questioned, as the official stories surrounding each of these events are riddled with holes and incongruities. The New York Times actually admitted back in 2005 that the government manufactures fake news stories and peddles them throughout the mainstream media as fact.

At the time, the Bush Administration was doing everything it could to perpetuate the myth that the war in Iraq was a success, and that the creation of the Transportation Security Administration (TSA) in response to "terrorism" was "one of the most remarkable campaigns in aviation history." One news clip that aired at the time even showed what appeared to be an overjoyed Iraqi-American celebrating the fall of Baghdad and thanking then-president Bush for bringing democracy to Iraq.

As it turns out, each of these events and more were completely fabricated by the government, as admitted by the NYT.

Liar, liar, pants on fire.
"[T]he federal government has aggressively used a well established tool of public relations: the prepackaged, ready-to-serve news report that major corporations have long distributed to TV stations to pitch everything from headache remedies to auto insurance," admits the report.

"In all, at least 20 federal agencies, including the Defense Department and the Census Bureau, have made and distributed hundreds of television news segments in the past four years, records and interviews show."

And if it happened under Bush, it is surely still happening today.


"People who claim that they're evil are usually no worse than the rest of us... It's people who claim that they're good, or any way better than the rest of us, that you have to be wary of." - Gregory Maguire

This is why I quit assuring people I was a good person. I don't think they believed me anyway.

Friday, June 19, 2015

Tyranny Breeds Disregard For Sanctity Of Life


Study Finds That EVERY Police Department in US Failed To Meet Use Of Lethal Force Standards!

‘Shocking lack of fundamental respect for the sanctity of human life’

A study undertaken by Amnesty International USA has found that every state in the US is failing to comply with the minimum international standards on the lethal use of force by police.

The report also found that 13 US states, more than a quarter, fall beneath the legal standards outlined in US constitutional law, while 9 of those 13 have NO laws whatsoever that encompass lethal use of force.

This means that in 9 states, police can kill someone and avoid the consequences by claiming they had no choice but to use lethal force.

“While law enforcement in the United States is given the authority to use lethal force, there is no equal obligation to respect and preserve human life. It’s shocking that while we give law enforcement this extraordinary power, so many states either have no regulation on their books or nothing that complies with international standards,” Amnesty USA executive director Steven Hawkins told the London Guardian.

Hawkins described the findings as evidence that law enforcement departments have a “shocking lack of fundamental respect for the sanctity of human life.”

The study compared the statutes regarding use of lethal force in all 50 states against international principles, which require that lethal force only ever be used "in order to protect life", in "unavoidable" circumstances, and after attempts to employ "less extreme means" to manage the situation.

International standards also outline that law enforcement officers should always identify themselves and give a clear warning if they intend to use deadly force.

The study found that not one single US state complies with both these standards, and only 8 states have a requirement of a verbal warning before engaging in the use of deadly force.

“None of the laws establish the requirement that lethal force may only be used as a last resort with non-violent means and less harmful means to be tried first. The vast majority of laws do not require officers to give a warning of their intent to use firearms,” the study concluded.

Amnesty noted that the 13 states that fall below US constitutional standards have statutes which are so vague in their wording, that they can easily be manipulated to allow for use of force in practically any circumstance.

The report notes that in North Dakota, police are sanctioned to use deadly force if an individual “has committed or attempted to commit a felony involving violence.” The level and scope of said violence and felony are not outlined in any way.

The 9 states that do not have any laws regarding lethal use of force are Maryland, Massachusetts, Michigan, Ohio, South Carolina, Virginia, West Virginia, Wisconsin and Wyoming – in addition to Washington D.C.

The report notes that this means police in those states nearly always investigate the actions of their own officers based on some arbitrary standard they have devised for the specific circumstance.

Perhaps the most troubling finding in the report outlines how in nine states, police are legally permitted to use lethal force during “rioting”. The study found that in Pennsylvania, lethal force can be used if it is deemed “necessary to suppress a riot or mutiny after the rioters or mutineers have been ordered to disperse.”

International standards on lethal force also require all police-related deaths to be reported. The central database where this activity is supposed to be logged, an FBI database, is completely voluntary, however. This means it is not really known how many “justifiable homicides” there are in the US, and the figure could be far higher than official records show.

Amnesty recommends a nationwide review of police use of lethal force laws, in addition to a thorough review and reform of oversight and accountability mechanisms at all levels of government.

Given the recent spotlight on police brutality in the US, Amnesty believes that “this report will produce some energy for change.”

Of course, US states are not beholden to comply with international laws. However, the findings, combined with the huge number of police-related killings in the US compared to other developed nations, paint a clear picture.

“Those states can of course argue that they follow common law or supreme court standards, but is that good enough?” Hawkins said. “Certainly we would expect that international human rights standards are what should govern and our fear is that, unless these are clearly quantified, a citizen in any state can’t look at what the law is. That’s critically important to ensuring accountability.”

A separate study recently compiled by Fatal Encounters, an impartial nonprofit organisation working to build a national database of police killings in the US, found that cops in the US are responsible for way more deaths on American soil than terrorism since the year 2000. Indeed, in that time, police have killed at least 5,600 people via gunshots, taserings, beatings and other forms of violence. That figure represents more than the total number of US combat deaths in all wars since 2000.

Americans are, at the very least, eight times more likely to be killed by a police officer than a terrorist.

Vox took the data gathered by Fatal Encounters and created an interactive map of every documented police killing over the past 15 years. Go here to view the map.

The organisation estimates that it has only captured about 35 percent of total police killings since 2000 so far. So at best, this map represents a minimum of police related killings over the past 15 years.

By those calculations, around SIXTEEN THOUSAND Americans are likely to have been killed by police in that time. Over 1000, every year.
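
For what it's worth, the arithmetic behind that estimate is simple enough to lay out (a quick sketch using only the figures quoted above):

documented_killings = 5600    # deaths captured by Fatal Encounters since 2000
estimated_coverage = 0.35     # the organisation's estimate of how complete its data is
years = 15                    # 2000 through 2015

estimated_total = documented_killings / estimated_coverage
print(f"Estimated total killings: {estimated_total:,.0f}")          # ~16,000
print(f"Estimated per year:       {estimated_total / years:,.0f}")  # ~1,067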

In comparison with other first-world nations: only three people were killed by police in the UK in 2014, 12 people in Canada, and eight over the past two years in Germany. All this despite the fact that the crime index highlights that countries like the UK aren't that far behind America in regards to overall crime rate.

The level of police killings appears to be escalating into an epidemic. It is indicative of an endemic societal divide between Americans and their government (yes, police work for the government).