Do Hollywood Films Truly Reflect Life in America? (You Asked Series)
They reflect the spirit of American life rather than its everyday facts. Hollywood's purpose has always been to entertain. While the details of American life may not be portrayed accurately, many Hollywood films do capture the nation's energy, restlessness, and longing.
(Pamphlet, 2 pgs.)
PDF – English, web version (543 KB)