As we age, a common human experience is losing faith in the institutions we grew up believing in (e.g., family, government, the economy, education, and religion). Is the American medical industry an institution that deserves our faith? Could it be causing unnecessary harm by promoting the invention of diseases, relying on erroneous mental health categories, and basing its practices on funding incentives? What are the strengths of the American medical industry when compared with the systems of other countries? And how do we correct the errors of this American institution so that it reflects an apolitical agenda intent solely on serving those in need?