this post was submitted on 05 Mar 2025
63 points (80.6% liked)

Technology

512GB of unified memory is insane. The price will be outrageous, but for AI enthusiasts it will probably be worth it.

top 32 comments
[–] ceiphas 36 points 4 days ago

taking Apple prices to a new extreme

[–] JoeKrogan 20 points 4 days ago* (last edited 4 days ago) (3 children)

Well, this news means there will be cheaper second-hand M1 and M2 machines on the market.

[–] [email protected] 12 points 3 days ago (3 children)

My college buddy and startup cofounder had a pathetically slow old laptop. He asked me the other day, "Should I buy an iPad Pro?" I was dumbfounded. Bro, you don't even have a proper computer. We went around a bunch and he kept trying to pick really bad ones, like a base model Mac mini. Finally I persuaded him to get a 16" M1 Pro for a grand (about $700 after his trade-in) and he couldn't be happier.

I'm still using my M1 MBP like 4 years later. Don't even care to upgrade! These things are great value.

[–] [email protected] 5 points 3 days ago

It occurred to me a few days ago that my "new" laptop (an M2 Pro MBP) is now almost 2 years old. The damn thing still feels new.

I really dislike Apple but the Apple Silicon processors are so worth it to me. The performance-battery life combination is ridiculously good.

[–] [email protected] 2 points 3 days ago

M2 user here. It is wonderful. You can't even get it to heat up.

[–] [email protected] 1 points 3 days ago

Honestly, the base level M1 mini is still one hell of a computer. I'm typing this on one right now, complete with only 8GB of RAM, and it hasn't yet felt underpowered in any way.

Encoded some FLAC files to M4A with XLD this morning: 16 files totalling 450MB, and it took 10 seconds to complete. With my workflows I can't imagine needing much more power than that.

[–] [email protected] 3 points 3 days ago

Unfortunately that market is already flooded with functionally-useless 8GB machines.

[–] helpImTrappedOnline 3 points 2 days ago* (last edited 2 days ago)

The storage prices are insane. It's over $9,000 to get the model with 512GB of RAM, and it still only has 1TB of probably non-removable internal storage.

  • 2TB: +$400
  • 4TB: +$1,000
  • 8TB: +$2,200
  • 16TB: +$4,600

That means the 8TB upgrade alone costs more than an entire base model Mac Studio at $2k.
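
For context, here's the implied cost per added terabyte, assuming those figures are upcharges over the 1TB base configuration (a rough Python sketch of the arithmetic, not official pricing):

```python
# Implied cost per added terabyte of storage, assuming each price
# is an upcharge over the 1TB that comes with the base configuration.
upgrades_usd = {2: 400, 4: 1000, 8: 2200, 16: 4600}  # total capacity (TB) -> upcharge ($)

for capacity_tb, upcharge in upgrades_usd.items():
    added_tb = capacity_tb - 1  # terabytes added on top of the 1TB base
    print(f"{capacity_tb} TB: ${upcharge} for +{added_tb} TB (~${upcharge / added_tb:.0f}/TB)")
```

Even at the top tier that works out to roughly $300 per added terabyte.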

For those prices I'd expect a RAID 5 or 6 system built in; God knows they have the processor for it.

[–] rdri 14 points 3 days ago (1 children)

"can be configured up to 512GB, or over half a terabyte."

Are you ok mate?

[–] [email protected] 6 points 3 days ago (2 children)

They're not wrong. 1000 GB is a terabyte, so 512 GB is over half a terabyte.

It's exactly half a tebibyte though.

[–] rottingleaf 8 points 3 days ago (2 children)

That's a retcon by hardware producers, who used confusion over measurement units to advertise less as more.

It's nice to have consistent unit naming, but when the industry has existed long enough with the old units, it looks like intentional harm for profit.

[–] [email protected] 4 points 3 days ago (1 children)

That's not a retcon. Manufacturers were super inconsistent with these prefixes, so the terminology was standardized. For example, floppy disks were advertised as 1.44MB but have an actual capacity of 1440 KiB, which is 1.47 MB or 1.41 MiB.
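
To make that concrete, here's a quick sanity check of the floppy numbers (plain Python, just unit arithmetic):

```python
# A "1.44 MB" floppy actually holds 1440 KiB: a mix of a binary kibibyte
# and a decimal "thousand", matching neither MB nor MiB exactly.
floppy_bytes = 1440 * 1024                  # 1,474,560 bytes

print(f"{floppy_bytes / 1000**2:.2f} MB")   # 1.47 MB  (decimal megabytes)
print(f"{floppy_bytes / 1024**2:.2f} MiB")  # 1.41 MiB (binary mebibytes)
```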

The standardization goes back to 1999 when the IEC officially adopted and published that standard.

There was a federal lawsuit on the matter in California in 2020 that agreed with the IEC terminology.

All of this was taken from this Wikipedia article if you'd like to read more. Since we have common usage, standards going back almost 30 years, and a federal US lawsuit all confirming the terminology difference between binary and decimal units, it really doesn't seem like a retcon.

[–] rottingleaf 4 points 3 days ago (1 children)

OK, fine, all the world might say whatever it wants, but my units are powers of 2.

[–] [email protected] 1 points 3 days ago (1 children)

I prefer it too, but just because "gibibyte" is a stupid word doesn't mean it's fine to go against standards.

[–] [email protected] 3 points 3 days ago (1 children)

Agreed, but do you pick the de facto standard of the entire industry (minus storage advertising), or the de jure standard of an outside body that has made very slight headway into a very resistant industry?

The reality is that people will be confused no matter what you do, but at least fewer people will be confused if you ignore the mebibyte, because fewer people have even heard of it.

[–] [email protected] 1 points 3 days ago* (last edited 3 days ago) (1 children)

You pick neither, and enforce correct usage of both in advertised products. Tech people will adapt, and non-tech people will be confused regardless (they still confuse megabytes/sec and megabits/sec, and that's an 8x difference).

[–] [email protected] 2 points 3 days ago

Agreed. I'd be entirely fine with legal enforcement of the ISO definitions in advertising; no need to air historical dirty laundry outside the profession.

[–] [email protected] 2 points 3 days ago

How is it a retcon? The use of giga- as a prefix for 10^9^ has been part of the metric system since 1960, and I don't think anyone in the fledgling computer industry was talking about giga- or mega- anything at that time. The use of mega- as a prefix for 10^6^ goes back to 1873, over 60 years before Claude Shannon even came up with the concept of a digital computer.

If anything, the use of mega- and giga- to mean powers of 1024 is a retcon of the earlier usage.

[–] [email protected] 2 points 3 days ago (1 children)

512 GiB is half a tebibyte. 512 GB is just under 477 GiB.
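
The conversion behind those numbers is just a couple of constants (a minimal Python sketch):

```python
GB  = 1000**3  # decimal gigabyte
GiB = 1024**3  # binary gibibyte
TiB = 1024**4  # binary tebibyte

print(f"{512 * GB / GiB:.2f} GiB")  # 476.84 -> 512 GB is just under 477 GiB
print(f"{512 * GiB / TiB:.2f} TiB") # 0.50   -> 512 GiB is exactly half a tebibyte
```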

[–] [email protected] 2 points 3 days ago (1 children)

Yup.

  • 512 GB > 1 TB/2 - what the article claims
  • 512 GiB = 1 TiB/2 - what many assume
  • don't mix GiB and GB

[–] [email protected] 2 points 3 days ago (1 children)

Correct. But that means 512 GB is not half a tebibyte.

[–] [email protected] 1 points 3 days ago

Ah, correct. RAM is sized in GiB, so I guess I implicitly made the switch.

[–] [email protected] 3 points 3 days ago

Weird that my mind just read that as MKUltra.

Maybe appropriate for AI.

[–] [email protected] 3 points 3 days ago (2 children)

Isn't unified memory terrible for AI though? I kind of doubt it even has the bandwidth of 5-year-old VRAM.

[–] KingRandomGuy 2 points 2 days ago

This type of thing is mostly used for inference with extremely large models, where a single GPU has far too little VRAM to even load the model into memory. I doubt people expect this to perform particularly fast; they just want to get a model to run at all.
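
As a rough illustration of why capacity matters here: the weight footprint alone is about parameter count times bytes per parameter. This is a back-of-envelope sketch with illustrative model sizes, and it ignores the KV cache and runtime overhead that real inference also needs:

```python
# Back-of-envelope memory footprint for a model's weights.
# bytes_per_param: 2.0 for 16-bit, 1.0 for 8-bit, 0.5 for 4-bit quantization.
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * bytes_per_param  # 1e9 params * N bytes each = N GB per billion

print(weights_gb(70, 2.0))   # 140.0 GB -> fits comfortably in 512GB
print(weights_gb(405, 2.0))  # 810.0 GB -> too big for 512GB at 16-bit
print(weights_gb(405, 1.0))  # 405.0 GB -> fits at 8-bit
```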

[–] KoalaUnknown 4 points 3 days ago

While GDDR7 VRAM is obviously faster, the sheer amount of memory can be a massive advantage for some models.

[–] alekwithak 3 points 4 days ago

That extreme's name? Albert Einstein.

[–] [email protected] 1 points 3 days ago* (last edited 3 days ago) (1 children)

Is memory that small and connected externally, or does the SoC just end up being a large package with that much RAM on it?

[–] [email protected] 2 points 3 days ago (1 children)

It's just external and soldered to the motherboard on Macs, no?

[–] chonglibloodsport 3 points 2 days ago

No, the RAM is integrated into the CPU package.