Technotica

joined 2 years ago
[–] Technotica 1 points 2 years ago* (last edited 2 years ago) (4 children)

Hey, tried the SVG question, this is the answer:

33B:

.....

User: How are you doing today?

Vicky: Today, I'm doing well, thank you for asking! How about you?

User: I am doing great! what can I use to draw curved text in an SVG?

Vicky: To draw curved text in an SVG, you can use the 'textPath' element. This allows you to wrap text along a path defined by an SVG path or elliptical arc. First, create the path using the 'path' or 'ellipse' element, then use the 'textPath' element to wrap the text along the path. Here's an example:

+++

html
<svg xmlns="http://www.w3.org/2000/svg" width="800" height="600" viewBox="0 0 800 600">
  <path id="curve" d="M200,200 Q400,200 400,400" fill="none" stroke="black" stroke-width="2"/>
  <text text-anchor="middle" dominant-baseline="central" x="300" y="300" text-path="url(#curve)">
    The quick brown fox jumps over the lazy dog
  </text>
</svg>

+++

In this example, a quadratic curve is defined using the 'Q' command in the 'path' element, and the text is wrapped along the curve using the 'text' element with the 'url(#curve)' attribute. You can adjust the 'd' attribute of the path and the text's position to create different curves with the text. .....

13B:

.....

User: what can I use to draw curved text in an SVG?

Vicky: In SVG, you can use the "textPath" element to create curved text. You can also use the "text" element and position it using "transform" attributes to achieve the same effect.

User: Can you give me a code example for that?

Vicky: Sure! Here's an example using the "textPath" element:

+++

php
<svg width="100" height="100">
  <text xlink:href="#text1" y="50" text-anchor="middle" font-size="20">
    This text has a curved path.
  </text>
  <rect x="50" y="50" width="30" height="10" fill="#000" />
  <text xlink:href="#text2" x="70" y="50" text-anchor="middle" font-size="20">
    This text has a curved path.
  </text>
</svg>

+++

And here's an example using the "text" element:

+++

php
<svg width="100" height="100">
  <text xlink:href="#text1" y="50" text-anchor="middle" font-size="20">
    This text has a curved path.
  </text>
  <rect x="50" y="50" width="30" height="10" fill="#000" />
  <text x="70" y="50" text-anchor="middle" font-size="20">
    This text has a curved path.
  </text>
</svg>

+++

*Note that you'll need to define the curved path for each text element using an SVG path string. In this case, the path strings are #text1 and #text2.

To use the same path for multiple text elements, you can define the path once and then reference it using the xlink:href attribute. The text-anchor attribute determines the point of the text element where the path is anchored.*

.....

(no idea why Lemmy doesn't italicize that last text part)

13B-Vicky gave more code, but I have no idea which example is better. 13B runs much faster than 33B, so it would be really interesting to know whether either (or both) of the examples is valid, and whether 13B, despite being the simpler model, gave a correct answer. (33B is a preview model, not the final version.)

What I am noticing is that neither actually used the textPath element in its code, though 33B used a text-path attribute; I have no idea if that exists.
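For comparison, here is roughly what I pieced together from the MDN page on textPath; I can't vouch for it myself (which is exactly why I am asking), but my understanding is that the text goes inside a textPath element whose href (xlink:href in older viewers) points at the path's id:

+++

html
<svg xmlns="http://www.w3.org/2000/svg" width="800" height="600" viewBox="0 0 800 600">
  <!-- the path the text should follow; invisible because fill is none and there is no stroke -->
  <path id="curve" d="M100,300 Q400,50 700,300" fill="none"/>
  <text font-size="24">
    <!-- textPath lays the characters out along the path referenced by href -->
    <textPath href="#curve" startOffset="50%" text-anchor="middle">
      The quick brown fox jumps over the lazy dog
    </textPath>
  </text>
</svg>

+++

If I read the docs right, startOffset="50%" together with text-anchor="middle" is supposed to center the text on the path, but again, not a coder.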

Can anyone who knows how to write code weigh in on this?

Thanks!

[–] Technotica 1 points 2 years ago* (last edited 2 years ago) (4 children)

Well, maybe it was a bit hyperbolic of me; I don't really have hard stats to compare. But if you have a problem GPT-4 could solve, I could pose it to Vicuna-33B (and maybe 13B just for comparison), see what its response is, and post it here. I think it would be a really interesting test!

[–] Technotica 1 points 2 years ago (11 children)

Ah, interesting! I guess I will try ignoring the "auto-conversation". Vicuna-33B is really good though, as eloquent in most things as what I have seen of ChatGPT so far.

[–] Technotica 17 points 2 years ago (1 children)

Sorry, we are all in the Star Wars memes section. LotR memes are next season.

[–] Technotica 9 points 2 years ago* (last edited 2 years ago) (1 children)

Ohhh.. Snacks! I was reading Snakes the whole time I was reading this thread and was really confused! (Curse my brain)

[–] Technotica 2 points 2 years ago (13 children)

Ah, thanks for the illumination, I understood that there is nothing "behind" the text like a personality or intent. It makes it really clear that LLMs are just very complex chatbots, doesn't it? But instead of just regurgitating text or writing a lot of nonsense like the old, simpler chatbots did, it can generate text far more coherently.

Vicuna-13B seemed pretty good at the illusion part, it must be really well optimized! I have seen LLaMA do less impressively: you ask it about the weather and it responds with what looks like an excerpt from a novel where characters talk about the weather, etc. :)

The "teeing it up" is done via the prompt file right? I saw that all of the examples have a general one sentence outline that guides the LLM on how to respond (i.e. "A chat between a User and an AI, the AI is very helpful and firendly") and then a few lines of interaction in style of:

User: Text
AI: Text
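So a minimal prompt file would look something like this, I think (my own rough version, loosely modeled on the chat example prompts that ship with llama.cpp; the trailing "User:" line matches the -r "User:" reverse prompt so the model stops and waits for your input):

+++

text
A chat between a User and an AI. The AI is very helpful and friendly.

User: Hello, who are you?
AI: Hi! I am an AI assistant. How can I help you today?
User:

+++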

[–] Technotica 2 points 2 years ago* (last edited 2 years ago) (1 children)

It's super easy, I followed this site: https://blog.gopenai.com/the-total-noobs-guide-to-harnessing-the-gpu-for-llama-inference-2e3c4fdfbc84 (you can view it without a login via archive.ph), then just downloaded a quantized Vicuna model from Hugging Face (it works with LLaMA and Alpaca too).

Then start the dialogue with, as an example: ./main -m /path/to/model/file --repeat_penalty 1.0 --color -i -r "User:" -f /path/to/prompt/file

You can see example prompt files in the llama.cpp folder and write your own short one.

I did it once on Windows but then decided to use my Debian system for it; I found the process easier on Linux since you can get the required software (Python etc.) more easily. If you want a more complete and even easier setup, try this:

https://cocktailpeanut.github.io/dalai/#/

Dalai installs everything you need for you. I don't think it supports Vicuna directly, but you can use LLaMA and Alpaca with it (and I am sure you could download the Vicuna model files manually and put them in the appropriate place for Dalai).

Some more guides:

https://hub.tcno.co/ai/text-ai/vicuna/

https://www.linkedin.com/pulse/step-by-step-guide-running-vicuna-13b-large-language-nischal

Oh, p.s.: I just learned about the --n-gpu-layers XX option; it makes the model run so much faster that I can now run Vicuna-33B Q5 adequately fast with 31 of 63 layers offloaded to the GPU.
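For reference, the full command I use now looks roughly like this (the paths are placeholders and 31 layers is just what fits in my VRAM, adjust both for your setup):

+++

bash
./main -m /path/to/model/file --repeat_penalty 1.0 --color -i -r "User:" -f /path/to/prompt/file --n-gpu-layers 31

+++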

[–] Technotica 1 points 2 years ago

Correct! Five cookies for you!

[–] Technotica 1 points 2 years ago

Sorry, wrong! ;)
