Honestly, it was not trivial: the custom-emoji handling in the markdown parser seems to be the vulnerable part. Of course everything should be sanitized, but in practice there are cases where it's hard to sanitize properly while still letting users write weird stuff. It's not like validating a username, where you know exactly which regex and which character set to allow.
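To make the "sanitize the output, not the input" idea concrete, here is a minimal sketch. It assumes marked as the markdown renderer and DOMPurify as the sanitizer (I don't know what the project actually uses), and the emoji table and shortcode syntax are made up for illustration; the point is only the ordering of the steps.

```ts
import { marked } from "marked";
import DOMPurify from "dompurify";

// Hypothetical custom-emoji table; the real project's emoji source is unknown.
const customEmoji: Record<string, string> = {
  party_parrot: "https://example.com/emoji/party_parrot.gif",
};

// Expand :shortcode: occurrences into <img> tags *before* the final
// sanitization pass, so the allowlist below also covers whatever the
// expansion produced.
function expandEmoji(html: string): string {
  return html.replace(/:([a-z0-9_+-]+):/gi, (match, name) => {
    const url = customEmoji[name];
    return url ? `<img class="emoji" alt=":${name}:" src="${url}">` : match;
  });
}

export function renderComment(markdown: string): string {
  const rawHtml = marked.parse(markdown) as string;
  const withEmoji = expandEmoji(rawHtml);
  // Sanitize the *final* HTML with a strict allowlist: anything injected by
  // a crafted shortcode name or URL gets stripped here.
  return DOMPurify.sanitize(withEmoji, {
    ALLOWED_TAGS: ["p", "a", "em", "strong", "code", "pre", "ul", "ol", "li", "img", "br"],
    ALLOWED_ATTR: ["href", "src", "alt", "class"],
  });
}
```

Whatever libraries are actually in use, the design point is that the allowlist runs after every transformation that can introduce markup, so a crafted emoji name can't smuggle in a script tag or an event-handler attribute.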
I would actually say that this particular impact should have been prevented with proper cookie security: marking the session cookie as HttpOnly makes it unreadable from JavaScript, so it can't simply be exfiltrated via XSS.
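This is roughly what I mean, as a minimal sketch assuming an Express-style backend (the article doesn't say what the actual stack is):

```ts
import express from "express";

const app = express();

app.post("/login", (req, res) => {
  // ...authenticate and create a server-side session (details omitted)...
  const sessionId = "opaque-random-id"; // placeholder for a real random token

  res.cookie("session", sessionId, {
    httpOnly: true,  // not readable via document.cookie, so injected JS can't exfiltrate it
    secure: true,    // only sent over HTTPS
    sameSite: "lax", // limits cross-site sending, which also helps against CSRF
    maxAge: 1000 * 60 * 60 * 24, // 1 day, in milliseconds
  });
  res.sendStatus(204);
});
```

An XSS payload can still make authenticated requests from the victim's browser, but it can no longer just read the cookie and ship it off, which is the "stealing the session" part.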
I do acknowledge, though, that it's not easy to take care of all of this when it's two people working on everything (from design to frontend, through to deployment, etc.), especially if nobody has specific appsec expertise.
I find some of the information in the article a little bit misleading when it comes to the GDPR. First, the US is considered an unsafe country to transfer data to for reasons such as the CLOUD Act. The GDPR simply prescribes that data can be shared with countries offering guarantees similar to the ones the GDPR itself provides. If the US doesn't want to offer such data-protection guarantees, I don't see why the limitation should be blamed on the regulator.
Second, each of these platforms already runs multiple deployed instances anyway, and each instance is often distributed across regions, etc. Painting this as an "engineering nightmare" seems debatable.
I am also not aware of any requirement forcing anybody to run everything in the US with a US company (which is therefore subject to the CLOUD Act). If companies make these choices, it seems natural to me that they should also take responsibility for them, rather than complain that well-known regulations that have existed for years forbid them from doing stuff...