this post was submitted on 11 Oct 2024

AskHistorians


[–] rustyfish 10 points 2 months ago* (last edited 2 months ago)

WW2 is taught very early on in German schools. Depending on what kind of school you attend, it might even dominate history class. Things like the Thirty Years' War, the Weimar Republic, WW1 and the reunification of East and West Germany are also taught, but not nearly as much as WW2. My history books back then were filled with pictures from concentration camps. It was made pretty clear who started that shit, how it started and what went on.

As for Japan, I can only give second-hand accounts. Usually the curriculum covers the years leading up to WW2 and then suddenly Japan is in the middle of it. Nothing about the fascist takeover, no mention of the Kempeitai. There is a lack of information about how it happened and about the crimes committed by their soldiers. Some of the worst monsters from WW2 are honored as innocent people at their own shrines. To this day the government plays the victim card very hard, and all around the country people only find out the truth for themselves years later. There are movements to change that and actually teach the truth, but they are relatively small.