I've been seriously thinking of going organic for most veggies and buying grass-fed meat. I find that these products are more expensive and harder to find, but you do pay for quality.
Just curious if anyone has gone this route — did you find it expensive, and do you feel better for having gone organic? After reading so much about the toxins/pesticides/antibiotics in everything we consume, I'm strongly leaning toward making the switch.
Thanks