Monday, October 10, 2022

Is it time to redefine what "woke" means?

In our human experience, we assign labels to almost everything. The label "woke" arose over the past few years. "Woke" has been defined as being self-aware, questioning the dominant paradigm, and striving for something better. When "woke" culture pushed for more racial diversity and the inclusion of facts into school curricula, the dominant culture began to push back. For example, teaching historical information in schools about slavery or the misdeeds of European Americans was frowned on by many in the dominant culture. Their rationale was "we don't want our children to be made to feel guilty about what happened in the past." In the state of Florida, the governor actually prides himself on being "anti-woke culture." But does he even understand what the issues are about in being "woke"?

So let's discuss this "woke" issue. I personally believe the label "woke" needs to be dropped. The label has been hijacked over the past few years and made into something negative. For me, "woke" is nothing new. It is basically a person being a nonconformist: not accepting everything you are told, and not accepting a current belief simply because the culture promotes it as "truth." In my world, "woke" basically equals "just tell the truth." If a story has two sides, tell both. Don't tell only the version that puts the dominant culture in the "good guy" seat. If there are facts that show the dominant culture misbehaved or did something wrong, tell those facts too. My example: when I was in elementary school we were taught, for some reason, a story about George Washington, the first President of the United States, having chopped down a cherry tree and then refusing to lie about having done so. But we were not taught that George Washington also owned human beings, black people, as slaves, as property. So why is the story about Washington not telling a lie more educational than the fact that he owned slaves? Is a lesson about telling a lie somehow more important than a lesson about the character it takes to own human beings as property and to treat them as property?

We have to admit that "the truth" is continuously kept from us. Think about what you were taught in school, what you hear on the news, what you read in digital media. What you read represents the decisions of one or more persons about what they want you to think about an issue. So do you believe everything you hear or read? What have you read that presents two different sides of a story? That is basically what "woke" equates to in my world: acknowledging that there is another side of a story that you have not been told. You should ask yourself, "Why are we not being told the other side of the story?" Then let the overall facts speak for themselves.

Over eighteen years ago I read the book "The Earth Shall Weep: A History of Native America" by James Wilson. The book described how many Native Americans were sent to "schools" to be taught to become "civilized" European Americans. These were youth who were stripped of any Native American cultural attributes they had learned. The schools changed their appearance, taught them to eat what European Americans ate, and taught them the facts that European Americans considered important. Their Native American culture was ignored, and they lost all knowledge of who they really were. In some ways, black people in the United States have voluntarily chosen to become African Americans by adopting the values and culture of European Americans. Many of us wear the clothes that European American culture values. We have learned to love the beer that European American culture values. We adopt everything that European American culture has assigned value to. Now many of us have become "woke" and are starting to reject these values and this culture. We are beginning to search for our truth and our real heritage, whether it is tied to countries in Africa or elsewhere. I am one of those who began that search over ten years ago.

I have never felt comfortable with European American values and culture. They never seemed natural, especially considering how this country has supported racist ideologies since its inception. It is a fact that the United States government condoned the practice of slavery. It is a fact that "white people," European Americans, enslaved black people to do the difficult work that European Americans refused to do, or were too lazy to do. It is a fact that a political system of "Jim Crow" laws was put into place after Reconstruction to re-enslave black people. Laws, and even the highest court in the United States, condoned racism for years.

So what is woke? It is simply thinking for yourself and understanding that the truth on many issues has been kept from you. A "woke" person is simply pursuing the sharing of all the components of "the truth," from which an individual can then make a truly educated decision. There is a scene toward the end of Spike Lee's movie "School Daze" in which one of the characters stands outside one morning exhorting everyone to "wake up." That is what "woke" is: asking everyone to wake up and stop letting the dominant culture control your thinking. Think for yourselves after examining all the evidence.

Being "woke" is not limited to one race or gender.  "Woke" is an awareness that we as human beings have many faults, one being the lack of the ability to accept information that does not present us in a positive light.  Yes, it is time to wake up.  And beware of anyone who tells you the truth has already been told.

My parents named me Arnell. The enslavement name assigned to my ancestors in America was Hill. As shown via a DNA test through African Ancestry, I am, at a minimum, a descendant of the Balanta people of what is now Guinea-Bissau and the Eshira/Eviya people of what is now the country of Gabon, both in Africa. That is my wokeness, my truth. What is yours?


