Today’s Question..
Why should employers be responsible for providing health insurance for their employees? That is what the Obama administration is trying to do now, making it mandatory for employers to provide coverage.
Why do we see health insurance differently from life, auto, or home insurance? We do not ask our employers to pay for those.