In a country where we've stripped away certain property rights and freedoms of association in an effort to right the wrongs of the past, where we give preference in every federal, state, and local government job, preference in college admissions, and preference in hiring at every large company I'm aware of, it's simply not a credible assertion that America is "racist" or "oppresses people of color". I'd bet your own PD has a "diversity" initiative and actively recruits and gives hiring preference to "people of color". Black privilege is real, and no reasonable person is going to buy into the "America is racist" narrative. People who believe it are either throwing childish tantrums or are actual racists themselves.
The very fact that these programs or laws are necessary is a clear indication that your assessment is incorrect.