
Most myths concerning the American West originated on the silver screen. Thanks to Hollywood, cowboys, Indians, gunfights, and outlaws paint a romanticized version of what really happened in the Old West.

Hollywood lied. Life in the Old West was vastly different from the classic Westerns. The clean-cut cowboys portrayed by actors like John Wayne are a far cry from what cowboys were really like.

Let's shoot down those Old West misconceptions.

The Wild West Wasnโ€™t That Wild

The phrase "Wild West" conjures up gun-slinging bandits, Native American ambushes, and sharpshooting cowboys wreaking havoc in frontier towns. But most towns were peaceful.

Historians even say the Wild West "was a far more civilized, more peaceful and safer place than American society today."

Formal local governments were often absent, but private organizations and clubs helped adjudicate matters of property, theft, and crime, and they kept things fairly orderly.
