This post was submitted on 22 Sep 2023

Perchance - Create a Random Text Generator

⚄︎ Perchance

This is a Lemmy Community for perchance.org, a platform for sharing and creating random text generators.

Feel free to ask for help, share your generators, and start friendly discussions at your leisure :)

This community is mainly for discussions between those who are building generators. For discussions about using generators, especially the popular AI ones, the community-led Casual Perchance forum is likely a more appropriate venue.

See this post for the Complete Guide to Posting Here on the Community!

Rules

1. Please follow the Lemmy.World instance rules.

2. Be kind and friendly.

  • Please be kind to others in this community (and in general), and remember that for many people Perchance is their first experience with coding. We have members for whom English is not their first language, so please take that into account too :)

3. Be thankful to those who try to help you.

  • If you ask a question and someone has made an effort to help you out, please remember to be thankful! Even if they don't manage to help you solve your problem, remember that they're spending time out of their day to try to help a stranger :)

4. Only post about stuff related to perchance.

  • Please only post about Perchance-related things, such as generators built on it, bugs, and the site itself.

5. Refrain from requesting Prompts for the AI Tools.

  • Please refrain from posting here if you need help specifically with prompting or achieving certain results with the AI plugins (text-to-image-plugin and ai-text-plugin), e.g. "What is a good prompt for X?" or "How do I achieve X with Y generator?"
  • See Perchance AI FAQ for FAQ about the AI tools.
  • You can ask for help with prompting at the 'sister' community Casual Perchance, which is for more casual discussions.
  • We will still help with and answer questions about the plugins as long as they are related to building generators with them.

6. Search through the Community Before Posting.

  • Please search through the community posts here (and on Reddit) before posting, to see whether a similar post already exists.

Example generators made with this plugin:

See the plugin page for more. There will probably be issues/bugs! Thank you in advance to the pioneers who test this and report bugs/issues in these first few days/weeks 🫡

(It was actually possible to discover this plugin a few days ago, but no one made it through all the clues lol ^^ some people did at least figure out the first step)

top 7 comments
[–] VioneT 3 points 1 year ago (1 children)

Just made ai-text-recipes and this template for testing.

I assume that if the AI is generating multiple paragraphs, those paragraphs are 'chunks'? Also, can we just use the onChunk() function instead of render(), since both are applied on each chunk?

[–] perchance 2 points 1 year ago* (last edited 1 year ago) (1 children)

Nice! Thank you for playing around with it.

The chunks are basically words, or chunks of words, but they can be larger than that. E.g. the first chunk is your startWith text if you specified that, and then each subsequent chunk is generally a little piece of text - corresponding to the chunks that are being appended to the output element several times per second.

The render function is specifically for transforming the output into some different form. Whatever you return from that function is what gets displayed - like in this example where we ask the AI for asterisks around actions (since that would be easy for it to generate) but then "render" that text so that the asterisked parts are italicized via HTML. Getting the AI itself to generate HTML is okay, but it has been trained mostly on text, rather than HTML, so it's probably better to get it to use a "syntax" that it's more accustomed to, and then we handle the transformation to HTML ourselves with render.
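
As a rough sketch of that asterisks-to-italics idea (illustrative only, not the plugin's exact API - it just shows the kind of replace() a render function could do):

    function render(text) {
      // Whatever render returns is what gets displayed, so swap
      // *asterisked actions* for HTML italics before display.
      return text.replace(/\*([^*]+)\*/g, "<i>$1</i>");
    }

    // e.g. render("She nods *waves hello* and smiles")
    //   => 'She nods <i>waves hello</i> and smiles'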

onChunk doesn't have any effect on the display of the output unless you specifically write some code to do that. It just allows you to run whatever custom code you want every time a new chunk is received.

But yeah, you can definitely just use onChunk if you want to manage the "rendering" yourself (e.g. onChunk: (data) => outputEl.innerHTML = data.fullTextSoFar.replace(...)), or if you don't want to change what is displayed but instead want to do something else for every chunk.
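
For instance (a hedged sketch only - the outputEl lookup and the option-object shape are assumptions for illustration; only onChunk and data.fullTextSoFar come from the notes above):

    const outputEl = document.getElementById("output"); // assumed output element

    const options = {
      onChunk: (data) => {
        // Runs every time a new chunk arrives; re-render the full text so far
        // ourselves, italicizing *asterisked actions* as we go.
        outputEl.innerHTML =
          data.fullTextSoFar.replace(/\*([^*]+)\*/g, "<i>$1</i>");
      },
    };

Alternatively, onChunk could leave the display alone entirely and just, say, count or log chunks as they arrive.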

Thanks for the question! I've just updated the plugin page with some details on most of the options that are currently available.

[–] VioneT 3 points 1 year ago

The list on the plugin page is really helpful! Thanks again for the explanation!

[–] VioneT 2 points 1 year ago (1 children)

Found the 🦙 but got stuck there! Thanks for this!

[–] perchance 2 points 1 year ago (1 children)

Can't wait to see what you create with this one! Your text-to-image-plugin creations (esp. realistic portraits) are amazing. Let me know if there are any extra prompt options that would make certain common use cases easier (akin to hideStartWith, which I guessed would be something people would ask for, but it was just a guess).

[–] VioneT 2 points 1 year ago* (last edited 1 year ago) (1 children)

Thanks! Just started testing it, and I'm running into some network-failure hiccups.

[–] perchance 2 points 1 year ago

Whoops! Thanks. Should be fixed now. Please keep me updated with any other issues you run into.