I know lots of people who went to college and ended up never doing anything with their degrees. For example, two of my close friends became SAHMs after graduating, and one of my guy friends went to an Ivy League school and is now a general contractor (family business).

My cousin (who is a successful business owner) thinks a lot of people could benefit from taking the money they were going to spend on college and using it to open their own business instead, learning the ropes the old-fashioned way.

What's your take on college? Do you think it's worth it for the educational component alone? Or is a degree not always necessary to be successful in life?