Insurance companies are key actors in the American economy, hedging risks and covering the costs of accidents. Operating throughout the country and abroad, US insurance companies are involved in life and health insurance, property and casualty insurance, business and commercial insurance, and reinsurance.