Nazi Germany Declares War on USA
Do you know why Nazi Germany declared war on the United States of America on December 11, 1941? Hitler did so believing that Japan would reciprocate by declaring war on the Soviet Union, which would have forced the Soviets to fight on two fronts. But the Japanese wartime authorities strung Hitler along on that promise for a while and then failed to follow through.