I've got a character who's a black psychiatrist in 1970s America, so I'd appreciate it if people could rec any good books that discuss: 1) what it was like to be a black medical professional around that time, and 2) what American psychiatry was like from the 1950s through the 1980s.
I'm not American, so American race relations and psychiatry aren't my forte, but Black Psychiatrists and American Psychiatry seems like a really obvious place to start.