Hollywood played a significant role during World War II, helping America form strong opinions about the enemy, encouraging unity within her own shores, and building solid support for the military. Dr ...