
So I've been reading a lot of stuff lately about organic food: how good it is for you, how important it is to make lifestyle changes, blah blah blah.
My question is this: Why should I have to pay more for food to be grown the way it was supposed to be grown in the first place? I read a story a couple of years back about people getting E. coli poisoning from green onions that had been irrigated with contaminated water in the fields. The onions were later sold and ended up in salsa that people ate.
I imagine a conversation something like this:
Veggie Guy: "Well, ma'am, we have two types of vegetables available for you to purchase. They both look green and healthy, but if you want the ones that did not get irrigated with water from cow feces, it's gonna cost you a couple of bucks more. So, which will it be? Fecal veggies or non-fecal veggies? Your choice."
Me: "Um, I guess the fecal veggies. The organic ones just seem to cost so much more..."
No, I don't think so. So now I have to go be a health nut and start shopping at the local Whole Foods store so I don't have to consume poop in my food, along with pesticides and other non-tasty comestibles. That sucks, if you'll pardon my scatological language. (Scat pun intended.)
I have a crazy idea! Why don't America's farmers get some of their pride back and just quit selling crap that human bodies were never meant to break down? I don't want a bunch of rainbow-colored additives in my family's food. I just want to feed the little buggers a couple of meals a day that won't give them cancer when they're 40. Is this too much to ask?
Until then, does anyone know how in the h**l to cook amaranth or quinoa?