Hollywood played a significant role during World War II, helping America form strong opinions about the enemy, encouraging unity within her own shores, and building solid support for the military. Dr ...