
Hitler in Hollywood (WT)

since 2024

“Hitler in Hollywood” (WT) sheds light on the complex relationship between Hollywood and Nazi Germany in the 1930s. The documentary shows how, through self-censorship, collaboration with the Nazis, and the influence of American fascists, the film industry long missed opportunities for decisive resistance against Nazi Germany.