While the Affordable Care Act has been in the news for several years now, many employers are still asking a fundamental question: Do I have to provide health insurance coverage to my employees? The short answer is no. The Affordable Care Act does not require any employer to offer, or to continue offering, health insurance to its employees. However, employers of all sizes need to understand the fundamental changes under the Act that take effect in 2014, and how those changes will affect an employer’s decision whether or not to offer health insurance coverage.