What Does Food and Drug Administration (FDA) Mean?
The Food and Drug Administration (FDA) is a U.S. federal agency whose origins date to the 1906 Pure Food and Drug Act, and whose mission is to promote and protect public health by implementing and enforcing federal laws and regulations. The main areas that fall under the FDA's jurisdiction are food safety, human and veterinary drugs, vaccines and other biologics, medical devices, dietary supplements, cosmetics, and tobacco products.