Are we just supposed to accept it?
Are we just supposed to live with it?
Our world is unfair, and we are taught
that it is okay, and well, there is nothing
that we can do to stop it.
Racism, stereotypes, and inequality.
Yeah, that's okay, really?
Why is our society greedy?
It's profit. Profit, no care at all.
Are we all being brainwashed? By school? By media?
Screwing others over, so we can get comfy.
Why is life hard for some,
and easy for others, or perhaps,
we are taught to just be happy
with our social setbacks,
because oh yeah, I am from social class E,
and well, because of that, I may not bother trying,
because I'll end up in a bad place anyway;
no job at all. May as well live off benefits.