Oct 31, 2008 14:15
Why do parents believe they have the right to force religion on their children? I saw this question on another blog.
Other than tradition handed down from patriarchal societies, in which wives and children were considered the chattel of the husband to do with as he wished, why should modern parents believe they have an unfettered right to teach their children anything they want? Indeed, what makes parents believe their children must hold the same views on politics or ethics as they do? Why not let them mature and make their own decisions? Here in the West we no longer believe we can force a woman to marry someone, or force a child to attend a certain university or take up a certain trade. Why does religion enjoy an exception to our otherwise fair-minded thinking?