this post was submitted on 14 Aug 2023
116 points (84.9% liked)

[–] synapse1278 28 points 1 year ago (6 children)

Just curious, is 24GB of RAM in a smartphone useful for anything?

[–] 46_and_2 11 points 1 year ago (1 children)

Doubt it, I don't even use that much on my gaming PC.

[–] [email protected] 10 points 1 year ago

It will let you run more advanced local AI. I'm looking forward to running private LLMs.

[–] [email protected] 8 points 1 year ago

The only reason I can think of is more on-device AI. LLMs like ChatGPT are extremely greedy when it comes to RAM. There are some optimizations that squeeze them into a smaller memory footprint at the expense of accuracy/capability. Even some of the best phones out there today can barely run a stripped-down generative AI model, and when they do, the output is nowhere near as good as an uncompressed model running on a server.
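
A rough sense of why RAM is the bottleneck: weight memory scales with parameter count times bits per weight. The sketch below is only a back-of-envelope estimate under assumed, illustrative model sizes and precisions (it ignores KV cache and runtime overhead), not a measurement of any particular phone or model.

```kotlin
// Back-of-envelope estimate of LLM weight memory at different quantization
// levels. Parameter counts and bit widths here are illustrative assumptions.
fun weightMemoryGiB(params: Double, bitsPerWeight: Double): Double =
    params * bitsPerWeight / 8.0 / (1L shl 30)

fun main() {
    val models = mapOf("7B" to 7.0e9, "13B" to 13.0e9)
    val precisions = mapOf("fp16" to 16.0, "int8" to 8.0, "int4" to 4.0)
    for ((model, params) in models) {
        for ((precision, bits) in precisions) {
            val gib = weightMemoryGiB(params, bits)
            println("$model @ $precision: ~%.1f GiB of weights (plus KV cache and runtime overhead)".format(gib))
        }
    }
}
```

At fp16 a 7B model is already ~13 GiB of weights on its own, which is why only heavy quantization fits on today's phones, and why 24 GB leaves room for less aggressive compression.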

[–] mikehunt 6 points 1 year ago

Mostly caching, I guess, so fewer cold starts of apps.
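
To illustrate the caching angle: Android keeps background processes cached until memory pressure forces it to kill them, so more total RAM generally means more apps resume warm. The helper below is a hypothetical sketch (the function name and logging are mine, not from the thread) using the standard ActivityManager API to read the figures the system reports.

```kotlin
import android.app.ActivityManager
import android.content.Context
import android.util.Log

// Hypothetical helper: logs how much RAM Android reports and whether the
// system considers itself under memory pressure. The OS uses this headroom
// to decide how many background processes stay cached.
fun logMemoryState(context: Context) {
    val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    val info = ActivityManager.MemoryInfo()
    am.getMemoryInfo(info) // fills totalMem, availMem, threshold, lowMemory

    val gib = 1L shl 30
    Log.d(
        "MemoryState",
        "total=${info.totalMem / gib} GiB, " +
            "available=${info.availMem / gib} GiB, " +
            "lowMemory=${info.lowMemory}"
    )
}
```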

[–] [email protected] 6 points 1 year ago

For the user? Not at all. For the companies that want their spying/tracking apps to run and take your precious data 24/7? Yes: this way dozens of apps can keep tracking you even after you open a hundred more and forget about them; they can live forever deep down in those 24 GB.

[–] OfficerBribe 6 points 1 year ago

It will allow future developers to create even less optimized apps and not worry about how resources are used.