I think the word "diet" itself has become such an ugly word. It's defined as the foods that a person consumes on a regular basis but, somehow, it has become associated with restriction and weight loss. I never claim to be on a diet, & it bothers me when people say that I am. I'm not "dieting"; I eat healthy, nutritious, minimally processed foods. I think that healthy eating is definitely trendy right now, for better or for worse. With diets like clean eating & paleo, and chain stores like Whole Foods & Trader Joe's, I think that "health food" is becoming more popular.
I've known too many girls who had their mother's or older sister's destructive weight/body issues forced on them, causing them to be insecure, self-conscious & obsessed with staying thin. It's so sad to me that little girls are so critical of themselves and feel that they have to lose weight to have value. One day when I'm a mother, I will make sure to be very careful about how I talk about myself, my body, & food in general. I won't obsess about being healthy, but I will make sure my children understand that taking care of themselves through nutrition & exercise is part of loving themselves. I'll shut up now, because I could go on about this forever.
Last edited by Keep Moving Forward; 05-01-2013 at 01:55 AM.