Jul 02, 2005 22:21
I always have something to bitch about. But for some reason this is particularly annoying me right now.
naturalism.
Since when do people in their fuckin 21st century homes and lives get to take a vow of NATURALISM?
First of all, what is natural? People NATURALLY evolved to produce things by their own means...like medicine and such. We need to survive by nature, and if medication aids that, what is unnatural about it? Or does natural just mean fucking roots and twigs and leaves? Okay, then please take off your clothes and spit out your restaurant food and take back years of washing your body with chemicals. You go find me a tree that grows those cute little plastic containers that you eat your natural vegan cookies out of. No. Naturally derived, but what isn't? Oh yeah, nothing.
Second, HOW DO YOU JUSTIFY ONLY DOING NATURAL DRUGS? Practically everyone has this standpoint. It makes no sense to me. Just because it grows from the fucking ground makes it no safer, in any way, than something that comes from a lab. "Nature wouldn't hurt me" even though hurting you is its prime defense mechanism and a large portion of plants will kill you because they're POISON.
You have a headache? So just take the fucking Advil. What logic is there in not doing something that could help you?? WE HAVE ACCUMULATED THE KNOWLEDGE, WHY NOT USE IT? All animals are constantly developing themselves in order to survive. Why shun this natural acquisition? It makes no sense. Oh, because they're foreign things to your body? Try pollution and weed smoke (yeah it's a plant ...THAT PRODUCES TOXIC SMOKE THAT BLACKENS YOUR LUNGS. WHHHHHHAT?) and half the food you probably eat.
Bottom line: who cares. The way I see it, you could say either NOTHING is natural anymore or EVERYTHING is natural. I got lazy and made lame points, sorry.