The US denazified Germany by hiring all the Nazis into important government roles.
Posted to History on 10 Jun 2023
It got the Nazis out of Germany, didn't it?!
When I commented, I couldn't see the picture. Funny that I guessed what it was.