The United States of America was settled by Europeans who pushed Native peoples off their land, whether by force or through the spread of disease. The country's industries wanted a cheap, controllable labor force, and since slavery was widely accepted at the time, ...