I'm sick of religion influencing people. Religion doesn't tell us right from wrong; we're born with that knowledge. I am an atheist, and my view has always been that religion undermines the concept of divinity. It's fine to believe in god, but all indoctrinating religion does is give "god" an opinion. If god does exist, his opinion on everything must be that it should exist, otherwise it WOULDN'T. All religious opinions are those of its interpreters. And I'm fucking sick of admitting that I'm an atheist and having people go "Oh, I'm sorry..." like I said I had my leg cut off. RELIGION IS NOT IMPORTANT. My behavior is perfectly acceptable, and I have been an atheist since I was able to think independently. All religion does is make people think that someone prefers them to other people. And I hate it.