Some U.S. employees are not sure whether they want their employers to help them improve their health.
Researchers at the Washington-based Employee Benefit Research Institute have published figures supporting that conclusion in a summary of results from a recent telephone survey of 1,000 U.S. residents ages 21 and older.