This article repeats some of the usual misinformation about the humanities being in decline, but it does get one thing right: employers now expect college grads to be prepared for the workplace right out of college. What used to be on-the-job training is now called an internship, and students need to gain that experience while still in college to have a shot at a job afterward.