Why did America stay neutral? Was it too much of a risk? Was FDR simply deferring to public opinion to protect his chances in 1940?

I know that neutrality in the “European Conflict” was a divisive issue in the United States back in 1939. But looking back at the events of the last two months, I’m still baffled as to why the Americans never declared war on Germany over the City of Flint incident. Was it too much of a risk? Was FDR simply deferring to public opinion to protect his chances in 1940?
