Most myths about the American West originated on the silver screen. Thanks to Hollywood, images of cowboys, Indians, gunfights, and outlaws paint a romanticized picture of what really happened in the Old West.

Hollywood lied. Life in the Old West was vastly different from the classic Spaghetti Western. The clean-cut cowboys portrayed by actors like John Wayne are a far cry from the real thing.

Let’s shoot down those Old West misconceptions.

The Wild West Wasn’t That Wild

The phrase "Wild West" conjures images of gun-slinging bandits, Native American ambushes, and sharpshooting cowboys wreaking havoc on frontier towns. But most towns were peaceful.

Historians even say the Wild West “was a far more civilized, more peaceful and safer place than American society today.”

There often weren't formal local governments, but private organizations and clubs adjudicated matters of property, theft, and crime, and they kept things fairly orderly.