What is with this taboo around women and money? It’s like an unwritten rule that we’re not allowed to seek it out or even talk about it. I don’t get it. Our society, especially the capitalist “utopia” that is the US, revolves around it.
It feels as if this weird-ass taboo about women and money was put into place as a way to control women. The people in power have the cash. Keep women from desiring or even talking about money, and you keep them from gaining power. You keep them financially dependent on the men in their lives, whether it be their fathers, boyfriends, or husbands.
I hate this taboo. I think it needs to go die in the flames of Mount Doom, to be completely honest. It’s why I go out of my way to talk about money and about women getting paid for the services we provide, and it’s something I suggest more women do. As long as we’re living in a capitalist society where money grants access to pretty much everything, we need to start actively seeking it out.