Sep 13, 2004 18:53
I was in class today and we were talking about some shitty subject, which can make for a fun debate, but I blew the class out of the water, and they didn't even realize it was me. Muahahhahahaha
I was just curious what some 'more educated' people have to say about it.
It's all about ethics.
Ok, here it goes.
Should people in certain professions have to be tested for AIDS? Should some professionals be barred from practicing if they have AIDS? Does a company have the right to ask whether an employee has AIDS?!
Another side of this: when someone takes a job with a high risk of possibly contracting AIDS through the work, and they do get it, is that just one of the risks they accepted? Or should they have the right to refuse 'customers' who have AIDS, and should they be able to ask beforehand? I'm thinking about doctors here.
I was just wondering what your thoughts were on the subject.
Whurd