Are AI models then going to train on the content they themselves created? What is the impact of that?
It leads to model collapse. The second AI starts to focus on patterns in the first AI's output rather than on the underlying content, and you get degraded output. They are pattern-matching machines, after all. Repeat the cycle a few times and the output turns to gibberish. Think of it as data incest.
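A minimal toy sketch of that feedback loop, purely for illustration (the long-tailed vocabulary, sample sizes, and categorical re-estimation are assumptions, not anyone's actual training pipeline): each "generation" estimates a token distribution only from the previous generation's samples, and rare tokens that get zero counts vanish for good, so the tails collapse.

```python
import numpy as np

# Toy simulation of recursive training on synthetic data (assumed setup, not a real pipeline).
# Each generation fits a categorical distribution to samples from the previous generation,
# then generates the next dataset from that fit. Tokens that happen to get zero counts
# can never reappear, so the diversity of the "data" shrinks generation after generation.

rng = np.random.default_rng(42)

vocab_size = 1000   # hypothetical vocabulary
sample_size = 2000  # hypothetical training-set size per generation

# Generation 0: long-tailed "human" distribution over tokens (Zipf-like).
probs = 1.0 / np.arange(1, vocab_size + 1)
probs /= probs.sum()

for generation in range(31):
    if generation % 5 == 0:
        surviving = np.count_nonzero(probs)
        print(f"gen {generation:2d}: distinct tokens remaining = {surviving}")
    # "Train" the next model only on samples produced by the current one.
    samples = rng.choice(vocab_size, size=sample_size, p=probs)
    counts = np.bincount(samples, minlength=vocab_size)
    probs = counts / counts.sum()
```

Run it and the count of distinct tokens drops steadily: the rare stuff disappears first, which is the degradation described above.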
So the AI companies are pretty desperate for fresh human-generated data. More data is currently the only lever they have to push through the diminishing returns.
What are platforms going to do about it? Start demonetizing AI-generated videos and banning AI-written fan fiction?