Perchance - Create a Random Text Generator
⚄︎ Perchance
This is a Lemmy Community for perchance.org, a platform for sharing and creating random text generators.
Feel free to ask for help, share your generators, and start friendly discussions at your leisure :)
This community is mainly for discussions between those who are building generators. For discussions about using generators, especially the popular AI ones, the community-led Casual Perchance forum is likely a more appropriate venue.
See this post for the Complete Guide to Posting Here on the Community!
Rules
1. Please follow the Lemmy.World instance rules.
- The full rules are posted here: (https://legal.lemmy.world/)
- User Rules: (https://legal.lemmy.world/fair-use/)
2. Be kind and friendly.
- Please be kind to others in this community (and also in general), and remember that for many people Perchance is their first experience with coding. We have members for whom English is not their first language, so please take that into account too :)
3. Be thankful to those who try to help you.
- If you ask a question and someone has made an effort to help you out, please remember to be thankful! Even if they don't manage to help you solve your problem - remember that they're spending time out of their day to try to help a stranger :)
4. Only post about stuff related to perchance.
- Please only post about Perchance-related stuff, like generators on it, bugs, and the site.
5. Refrain from requesting Prompts for the AI Tools.
- We would like to ask you to refrain from posting here when you need help specifically with prompting/achieving certain results with the AI plugins (text-to-image-plugin and ai-text-plugin), e.g. "What is a good prompt for X?", "How do I achieve X with Y generator?" - See the Perchance AI FAQ for FAQ about the AI tools.
- You can ask for help with prompting at the 'sister' community Casual Perchance, which is for more casual discussions.
- We will still be helping/answering questions about the plugins as long as it is related to building generators with them.
6. Search through the Community Before Posting.
- Please search through the community posts here (and on Reddit) before posting, to see if there is already a similar post or your question has already been answered.
Hmm, I wasn't able to replicate this problem. I added this as the character's custom code, and it does log the response in the console:
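For illustration, custom code roughly along these lines (the URL and headers are placeholders, not the original snippet) is enough to fetch an endpoint and log the response:

```js
// Placeholder endpoint and headers - swap in the API you're actually calling.
fetch("https://api.example.com/endpoint", {
  headers: { "Accept": "application/json" },
})
  .then(res => res.json())
  .then(data => console.log(data)) // the response should show up in the browser console
  .catch(err => console.error(err));
```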
Is it perhaps header values you're missing, or something? It's certainly possible that there is a bug, though, and I'm just not hitting it with the above example for whatever reason.
Either way, this motivated me to finally get around to creating a CORS-bypassing plugin: https://perchance.org/fetch-plugin That way I can ensure strong backwards-compatibility and performance instead of just relying on a little glitch.com server, which won't scale and doesn't have good perf/uptime guarantees. I'll wait to hear back from you before integrating it into ai-character-chat, just so we don't end up making it harder for you to reproduce the bug you were up against here.

The bottom of the page says "Check out more plugins at perchance.org/plugins" twice.
edit: and now it doesn't. must be magic
@[email protected] @[email protected] Sorry, just realized the plugin is not quite ready for prime time! I need to make some breaking changes, so I've commented out the code for now. The moment I started integrating it into ai-character-chat I realised it's a bad idea to overwrite the existing fetch - it should instead just be a separate "superFetch" or whatever, because otherwise if you need to e.g. download a big file that you know doesn't have any CORS problems, you're forced to go through the proxy, which will be slower. Will update soon hopefully, if not tomorrow. Sorry for the trouble if you'd started playing with it!
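As a rough sketch of that design choice (not the plugin's actual implementation - the proxy URL is a placeholder): plain fetch stays untouched for direct requests, and superFetch is an explicit opt-in that routes a request through the CORS proxy only when you actually need it.

```js
// Sketch of the idea only - PROXY_URL is a placeholder, not the plugin's real endpoint.
const PROXY_URL = "https://example-cors-proxy.example.com/?url=";

// Plain fetch stays untouched, so e.g. big files with no CORS issues still go direct.
// superFetch explicitly routes the request through the proxy for CORS-blocked endpoints.
async function superFetch(url, options) {
  return fetch(PROXY_URL + encodeURIComponent(url), options);
}

// fetch("https://cdn.example.com/big-file.bin");            // direct, no proxy overhead
// superFetch("https://api.example.com/cors-blocked.json");  // proxied
```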
@[email protected] @[email protected] Okay, superFetch it is: https://perchance.org/super-fetch-plugin ... phew - happy I managed to pull the plug on that previous approach quickly, but sorry again for the breaking change! I'd only ever do this when a plugin is extremely new - i.e. a few hours old, as in this case. I take breaking changes with the official plugins really seriously, so it was still somewhat painful to do.

Previously it worked fine, but now it doesn't. Here's the code for the ElevenLabs API that I wrote. Here is a HAR file of the requests.
Okay, thanks for the example, I think it's all fixed now. Hopefully I didn't break anything. Been up for two days straight tho so I wouldn't bet on it, but I did some basic tests and it seems good. Will check lemmy messages first thing tomorrow 🫡
haha. one and a half days for me and maybe 30+ hours straight just now on something with your wonderful comments update :) can't wait to share it! No sleeping yet!
Same 😄
I'll complete and share the project possibly (and hopefully) as an actual plugin after the post-announcement update of my generator hub page, but I'll be releasing the "early implementations" somewhere in my experiment generator so everyone can try it right now and give some feedback on it.
It seems to work now, though there are some inconsistencies with the chunk text arrangement, which causes the text in the stream to be quite jumbled. I'm looking into it now; I'll update if it is still inconsistent with the order.
EDIT: It is inconsistent with the order of the chunks. Maybe there is a way to parse it in order? Currently I'm pushing the chunks into an array, sorting that array by index, then joining the sorted array into a string before pushing it to the t2s, though it is still inconsistent, and sometimes the streaming finishes while the text to be spoken is not yet queued up.
Here are the code hacks to re-sort the chunks to order lmao.
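A minimal sketch of that workaround, assuming each chunk arrives with an index and some text (`chunk.index`, `chunk.text`, and the `speak` stub are placeholders for the real stream and text-to-speech hook):

```js
const speak = (text) => console.log("queue for t2s:", text); // stand-in for the t2s call
const received = [];

function onChunk(chunk) {
  received.push(chunk);
  // Re-sort on every arrival so out-of-order chunks end up in the right place.
  received.sort((a, b) => a.index - b.index);
}

function onStreamEnd() {
  // Join the sorted chunks into one string before queuing it for speech.
  speak(received.map(c => c.text).join(""));
}
```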
I wasn't able to reproduce this when trying it with this code:
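A rough sketch of the kind of out-of-order check involved (`onChunk` and `chunk.index` are placeholders for the actual streaming hook and chunk shape):

```js
let lastIndex = -1;

function onChunk(chunk) {
  // Flag any chunk that arrives with a lower index than one we've already seen.
  if (chunk.index < lastIndex) {
    console.log("OUT OF ORDER CHUNKS!", chunk.index, "arrived after", lastIndex);
  }
  lastIndex = Math.max(lastIndex, chunk.index);
}
```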
Or have I misunderstood the problem?
You have. On trying your code, it gave me the OUT OF ORDER CHUNKS! message. Here's the character I used, with the custom code to check the out-of-order chunks: Link to Character
Ahh, thank you! I was very confused at first, because I iteratively made your character closer to the default Assistant to work out why it was happening in yours but not the Assistant, and found that the profile pic was the cause lmao. Eventually I realised that it was because of the data URL vs normal URL difference: the data URL (being larger) was making an async IndexedDB request take a few milliseconds longer, which caused the out-of-order-ness. But I shouldn't have even been doing those DB requests in the first place, so I've removed them, and this race-condition-type bug shouldn't be possible at all now. Thanks again!!
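To illustrate the pattern (this is not the actual ai-character-chat code - the lookup and forwarding below are stubs): when each chunk awaits an async lookup whose latency varies, a slower lookup lets later chunks overtake earlier ones; chaining the handlers onto a single promise queue (or removing the lookup entirely, as was done here) keeps delivery in order.

```js
// Stub with variable latency, standing in for the per-chunk IndexedDB read
// that got slower when the avatar was a large data URL.
const someAsyncLookup = () =>
  new Promise((resolve) => setTimeout(resolve, Math.random() * 20));

// Racy version: chunks can reach `forward` out of order.
async function handleChunkRacy(chunk, forward) {
  await someAsyncLookup();
  forward(chunk);
}

// Ordered version: serializing onto one promise chain preserves arrival order.
let queue = Promise.resolve();
function handleChunkOrdered(chunk, forward) {
  queue = queue.then(async () => {
    await someAsyncLookup();
    forward(chunk);
  });
}
```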
Niiceeeeeeeeeee it is good now. Thanks again!