If I were to leave my current employer and work for a competitor, I could easily get a 20% increase in my base salary, more vacation days and a signing bonus. So why don't I? Because I'm treated really well, I'm not overworked, and I like my boss and the culture of my company. My job never gets stale either; whenever I want to expand my responsibilities, my boss is always open to it and totally supportive.

I've been in the opposite situation before and it was horrible. I feel like there is more to a job than the pay!

Does anyone feel differently? It seems like our society puts so much emphasis on the total amount of $ you make, not the overall benefits of a job.