Body Positive Books That Don't Feed Into Diet Culture or Toxic Habits

As body positivity, body neutrality and fat liberation make their way into the cultural lexicon of Instagram captions and mainstream magazines, it's easy to see bits of diet culture sneak in, keeping the focus on weight loss and shrinking a body rather than on how to properly nourish the person who lives in said body. Not cool, we hate to see it.


That's why it's great that we're seeing more and more material that kicks aside those preconceived notions about health and fitness and instead focuses on the ways we can radically love our bodies and make the world more inclusive for every kind of body. If your bookshelf is offering slim pickings on books that make you feel good and empowered about the skin you're in, look no further: we've got a grown-up summer reading list to help you consider all the ways your body deserves a little more care and kindness.

Read on for books about nutrition, self-love, wellness and just rocking your best life in your body no matter what society’s obsession with thinness tries to tell you.

Our mission at SheKnows is to empower and inspire women, and we only feature products we think you’ll love as much as we do. Please note that if you purchase something by clicking on a link within this story, we may receive a small commission of the sale. 
