Legal Question in Business Law in Utah

insurance in the workplace

Are bosses legally required to give health insurance to full-time employees?


Asked on 1/28/09, 6:55 pm

1 Answer from Attorneys

Re: insurance in the workplace

No, employers are not legally required to offer health insurance to employees. Most larger employers do offer some degree of coverage as an employee benefit, but employee benefits are entirely optional.

Answered on 1/28/09, 10:19 pm
