It's definitely possible, since all the code for Perchance generators is openly available and downloadable, but unfortunately there's no "one-click" way to do it yet, and it still requires a bit of coding knowledge.
I think I wrote a comment related to this a few months back. Basically, you'd need to use something like ollama, llama.cpp, TabbyAPI, Aphrodite, vLLM, or TGI (...etc.) to run the AI text gen model (and ComfyUI or Forge WebUI for image gen). Unfortunately, even a top-of-the-line gaming GPU like a 4090 doesn't have enough VRAM to hold a 70B text gen model fully, so it may be slow. Then you'd need to swap out some code in perchance.org/ai-text-plugin and perchance.org/text-to-image-plugin so that they reference your localhost API instead of Perchance's server. You'd just fork the plugins, make the changes, then swap out the imports of the AI plugins for your new copies in the gens you want to self-host.
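To give a rough idea of what the "swap out some code" step looks like: this is a hypothetical sketch (the real plugin internals differ, and the function name here is made up), assuming you're running ollama locally, which by default serves a JSON API at port 11434 with a /api/generate endpoint.

```javascript
// Hypothetical sketch only - not the actual ai-text-plugin code.
// Assumes a local ollama server at its default address/port.
const LOCAL_API = "http://localhost:11434/api/generate";

function buildLocalRequest(prompt, model = "llama3") {
  // ollama expects a JSON body; stream:false makes it return a single
  // JSON object with the full completion in its "response" field.
  return {
    url: LOCAL_API,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}

// In your forked plugin you'd then do something like:
//   const req = buildLocalRequest(userPrompt);
//   const res = await fetch(req.url, req.options);
//   const text = (await res.json()).response;
```

The point is just that the fork mostly amounts to redirecting the fetch call from Perchance's endpoint to your localhost one and adapting the request/response shape to whatever your local server expects.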
Someone in the community with some coding experience could do the work to make this easier for non-coders, and hopefully they'll share it in this forum if they do. I'll likely get around to implementing something eventually, but probably won't have time in the near future.