Jan 16, 2006 21:34
Do wounds really heal?
--A friend of mine was accidentally shot in the finger with a BB gun when she was little, and sometimes she still feels pain there.
--And what about broken bones, are they ever really whole again? I have this fear about my foot that even though it's almost better, and soon I'll be frolicking about like I used to, it will always be weaker than it was before. And it will never really be the same again.
--On a TV show (Grey's Anatomy, maybe?), a woman had a minor heartache every year on the day her lover died.
So does time really heal wounds? Or do they just sort of fade away, but stay with us like scars forever?