I’ve noticed an uptick in government distrust - basically the idea that the FBI, DOJ, MSM, federal government, crime stats, etc. are always lying to us for nefarious reasons, purposely trying to ruin America.
It’s fascinating to observe.
When did this become a mainstream belief?