Most nurses don't even learn practical skills in school; they learn them on the job. School is supposed to give them grounding in theory and subjects like anatomy, but honestly most of that gets picked up on the job too. Nursing education these days is a big joke, padded with a lot of nonsense just so it can count as a bachelor's degree and look more "professional." The old-school hospital diploma programs produced far better nurses, but I'm not sure they even exist anymore.

Looking at Twatter and Instagram, you'd think nursing education mostly instills a sense of massive self-importance, arrogance, and some minor practical skills.
Also true. Those are the nurses with a couple of years of experience, though. Often they have to lead the doctors around. In some settings, resident physicians are practically taught by nurses when it comes to day-to-day responsibilities, with the attending doctors stepping in only for the big stuff.

Looking at actual nurses, though, in my experience they have a wealth of practical experience, while doctors are often a batch of giddy idiots almost incapable of normal human interaction.