I hear this sentiment a lot, applied to many different aspects of parenting and life. The phrase comes up frequently in various blogs I read:

https://www.google.com/search?q=%22tired+of+society+telling%22

I always wonder who "society" is. Is it television shows and magazines? The consensus of the general public? Family and friends? The medical establishment?

How do you define "society"... and is there anything you're tired of society telling you?