United States Marine Corps

History
The United States Marine Corps is a branch of the United States Armed Forces. Its men and women primarily conduct ground and amphibious combat operations to defend the United States against foreign threats.