this post was submitted on 15 Sep 2024
263 points (89.7% liked)

Technology

Hard to believe it's been 24 years since Y2K (2000). It feels like we've come such a long way, but this decade started off very poorly with one of the worst pandemics the modern world has ever seen, and technology in general is looking bleak in several ways.

I'm a PC gamer, and things are stagnating massively in our space. So many gaming companies are incapable of putting out a successful AAA title, either because people are too poor or because nobody wants another live-service AAA disaster like nearly every one released lately: Call of Duty, Battlefield, and almost anything Electronic Arts or Ubisoft puts out either fails outright or undersells. So many gaming studios have been and are being shuttered, and Microsoft is basically one member of an oligopoly alongside Sony and a couple of other companies.

Hardware is stagnating. Nvidia is putting the brakes on its next line of GPUs; we're not going to see huge gains in performance anymore because AMD hasn't caught up yet, so Nvidia has no reason to innovate. They'll just sell the top cards of their next line for $1,500 a pop with a 10% increase in performance rather than the 50-60% we really need. We still don't have the capability to play games at full native 4K 144 Hz. That's at least a decade away.
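To put the native 4K 144 Hz target in perspective, here's a back-of-the-envelope throughput estimate (my own illustrative arithmetic, not figures from the post):

```python
# Raw pixel throughput and per-frame time budget for native 4K at 144 Hz.
WIDTH, HEIGHT, REFRESH = 3840, 2160, 144

pixels_per_frame = WIDTH * HEIGHT               # 8,294,400 pixels
pixels_per_second = pixels_per_frame * REFRESH  # ~1.19 billion pixels/s to shade
frame_budget_ms = 1000 / REFRESH                # ~6.94 ms to render each frame

print(f"{pixels_per_second:,} pixels/s, {frame_budget_ms:.2f} ms/frame")
```

Every shading, lighting, and post-processing pass has to fit inside that ~7 ms window, which is why "native 4K 144 Hz" remains out of reach without upscaling tricks.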

Virtual reality is on the verge of collapse. Meta is basically the only real player in that space, holding something close to a monopoly alongside the Valve Index; Pico from China may be on the verge of developing something incredible as well, and Apple just revealed a mixed-reality headset, but the price is so extraordinary that barely anyone owns one, so usage isn't widespread. Here, too, we're a decade away from seeing anything really substantial in terms of performance.

Artificial intelligence is really, really fucking things up in general, and the discussions about AI look almost as bad as the news about the latest election in the USA. Any news about AI sounds clowny, ridiculous, and over-the-top. The latest is that OpenAI is going to convert from a non-profit to a for-profit company, after promising it was operating for the good of humanity and breaking countless laws scraping copyrighted material, supposedly for the public good. Now they're just going to snap their fingers and morph into a for-profit company: take anything copyrighted while claiming it's for the public good, then swap to a for-profit model. It doesn't make any sense, and it looks like they're going to be a vessel for widespread economic poverty...

It just seems like a lot of bubbles are about to burst all at the same time. I don't see how things can possibly get better for a while now.

[–] j4k3 37 points 2 months ago (4 children)

Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

Nvidia is just playing it conservative because the market massively overvalued it. Using GPUs for AI is a stopgap hack until hardware can be developed from scratch; the real life cycle of hardware is about 10 years from initial idea to first consumer availability. The CPU's problem with AI is quite simple: feeding the math units fast enough. It will be solved in a future iteration, and when it is, the GPU will get relegated back to graphics, or it might even become redundant entirely.

Once upon a time the CPU needed a separate math coprocessor to handle floating point. That experiment ended with the FPU integrated onto the CPU die, proving that a general monolithic solution is far more successful. No data center operator wants two types of processors for dedicated workloads when one type can accomplish nearly the same task. The CPU must be restructured around a wider-bandwidth memory cache. This will likely require slower thread speeds overall, but it is the most plausible solution in the long term. Solving this issue will likely come with more thread-level parallelism, and therefore has the potential to render the GPU redundant in favor of a broader range of CPU scaling.
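The "feeding the math units" bottleneck can be made concrete with a roofline-style estimate. All the numbers below are illustrative assumptions, not vendor specs: a workload is memory-bound whenever its arithmetic intensity (FLOPs per byte moved) falls below the machine's compute-to-bandwidth ratio.

```python
# Roofline-style check: is a GEMV (matrix-vector product, the core of
# LLM token generation) compute-bound or memory-bound on a hypothetical CPU?
peak_flops = 2e12   # 2 TFLOP/s peak compute, assumed
mem_bw = 100e9      # 100 GB/s DRAM bandwidth, assumed

machine_balance = peak_flops / mem_bw   # FLOPs per byte needed to saturate compute

# GEMV with an N x N fp32 matrix: ~2*N*N FLOPs, ~4*N*N bytes streamed once.
n = 4096
flops = 2 * n * n
bytes_moved = 4 * n * n
intensity = flops / bytes_moved         # 0.5 FLOPs/byte, independent of N

memory_bound = intensity < machine_balance
attainable_flops = min(peak_flops, mem_bw * intensity)  # bandwidth caps throughput

print(f"balance={machine_balance:.0f} FLOP/B, intensity={intensity} FLOP/B, "
      f"memory_bound={memory_bound}")
```

Under these assumed numbers the CPU would run the workload at a fraction of its peak compute, which is the comment's point: wider, faster memory paths matter more than more ALUs.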

Human persistence of vision can't keep up with ever-higher refresh rates; past a point they're only marketing. The hardware will likely never support this stuff anyway, because no billionaire is putting up the funding to back the marketing with tangible hardware investments... IMO.
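Whatever one thinks about perception thresholds, the diminishing returns here are easy to quantify: each refresh-rate jump saves less absolute frame time than the one before (simple arithmetic, not a claim about vision science):

```python
# Absolute frame-time savings shrink as refresh rate climbs.
rates_hz = [60, 120, 240, 360]
frame_ms = [1000 / r for r in rates_hz]  # frame time in milliseconds

for prev, cur, prev_ms, cur_ms in zip(rates_hz, rates_hz[1:], frame_ms, frame_ms[1:]):
    print(f"{prev} -> {cur} Hz saves {prev_ms - cur_ms:.2f} ms per frame")
```

Going from 60 to 120 Hz saves over 8 ms per frame; going from 240 to 360 Hz saves barely 1.4 ms, despite demanding far more rendering power.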

Neo-feudalism is well worth abandoning. Most of us are entirely uninterested in this business model. I have zero faith in the present market. I have AAA-capable hardware for AI. I play and mod open source games. I could easily be a customer in this space, but there are no game makers serving it. I do not make compromises in ownership. If I buy a product, my terms of purchase are full ownership with no strings attached whatsoever. I don't care what everyone else does. I am not for sale, and I will not sell myself for anyone's legalese nonsense or pay ownership costs to rent from some neo-feudal overlord.

[–] Chocrates 19 points 2 months ago (4 children)

Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

I'm a die-hard open source fan, but that still feels like a stretch. I remember 10 years ago we theorized that Windows would get out of the OS business and become just a shell over a Unix kernel, and that never went anywhere.

[–] [email protected] 4 points 2 months ago (1 children)

I don't think that is necessarily out of the running yet. OS development is expensive and low-profit, so commoditization may be inevitable. Control of the shell and GUI, where they can push advertisements, shovelware, and telemetry on you: that is what's profitable.

So in 20 years, 50? I predict proprietary OSes will eventually die out, on the balance of probability.

[–] Chocrates 2 points 2 months ago

I'm with you in the long term.

I am curious what kernel backs the computers on the stuff SpaceX is doing. I've never seen their consoles, but I'm guessing we're closer to modern reusable hardware and software than we were before. As niche applications like that keep getting more diverse, I bet we'll get more open specifications so everything can work together.
But again, I'm more pessimistic and think 50 years would be relatively early for something like that.

[–] [email protected] 1 points 2 months ago

That's probably closer today than it was then. The added complication is that the client is probably not thin enough for them to return to the mainframe model, which would be vastly easier to monetize.

Besides, we got WSL out of the bargain, so at least interop isn't a reverse-engineering job. Poetically, it's the reason Linux ended up killing the last few Windows Server shops I knew. Why bother running Windows Server X just to run Apache under Linux? Why bother with Hyper-V when you can pull a whole Docker image?

If Fortune 500 execs are sold on Microsoft, it's mostly as a complicated contractual absolution of cybersecurity blame.

[–] rottingleaf 1 points 2 months ago (2 children)

It remained in the OS business to the extent required for the malware business.

Also, NT is not a bad OS (apart from being closed, proprietary, and probably messy by now). The Windows subsystem on top of it would suck just as badly if it ran on something Unix-based.

[–] [email protected] 1 points 2 months ago (1 children)
[–] rottingleaf 1 points 2 months ago

Well, was spyware-ridden Kazaa malware?

I mean, I agree.

[–] Chocrates 1 points 2 months ago (1 children)

Yeah, I guess in my fantasy I was assuming that Windows would do a full rewrite and adopt the Unix ABI, but I know that wouldn't happen.

[–] rottingleaf 1 points 2 months ago

They have a few legacy things working in their favor. Hardware compatibility is one, but that seems to be a thing of the past now that people don't care. Application compatibility is another, and that lies with Windows, not with NT.

And they don't have to change the core parts, because NT is fine. Windows is not; it's a heap of legacy, but it's not realistically replaceable.

Unless they develop a new subsystem from scratch, call it Embrasures or Walls or Bars, and gradually deprecate Windows. That doesn't seem very realistic either, but if they were still a software company and not a malware company, they'd probably be starting on it about now.

[–] solomon42069 1 points 2 months ago

I think the games industry will start to use open source tools like Blender and Godot more and more. These options have really matured over the years and now compete on features and productivity with commercial options.

From a business POV, open source makes a lot of sense when you need a guarantee that your investment won't evaporate because a vendor cancelled a feature or API your game uses. With open source, if you don't like the path the upstream code is taking, you can fork and make your own!

Part of the dynamic is also how people are inspired and learn skills. You can learn very advanced Blender techniques for free on YouTube; why would you pay a private college thousands of dollars to learn an expensive program like Maya to do the same thing?

[–] [email protected] 6 points 2 months ago (1 children)

AI still needs a lot of parallelism but has relatively relaxed latency requirements. That makes it ideal for a large expansion card instead of putting it directly on the CPU die.

[–] j4k3 8 points 2 months ago

Multithreading is parallelism and is poised to scale by a similar factor; the primary issue is simply getting tensors in and out of the ALU. Good enough is the engineering game. Having massive chunks of silicon lying around unused is a much more serious problem. At present, the choke point is not the parallelism of the math but the L2-to-L1 bus width and cycle timing. The ALU can handle the load; the AVX-512 instruction set can load a 512-bit-wide word in a single instruction. The problem is just moving those words in and out in larger volume.
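A rough illustration of that load-bandwidth point (the core count, clock, load ports, and DRAM figure below are all assumptions for the sake of arithmetic, not measurements): even a couple of 64-byte AVX-512 loads per cycle demand far more data than typical DRAM can feed across many cores, so sustained throughput lives or dies on the cache hierarchy.

```python
# Peak load demand of AVX-512 cores vs. an assumed DRAM bandwidth.
BYTES_PER_LOAD = 512 // 8   # one 512-bit register = 64 bytes
LOADS_PER_CYCLE = 2         # assumed load ports per core
CLOCK_HZ = 5e9              # assumed 5 GHz clock
CORES = 16                  # assumed core count

per_core_gbps = BYTES_PER_LOAD * LOADS_PER_CYCLE * CLOCK_HZ / 1e9  # 640 GB/s
total_demand_gbps = per_core_gbps * CORES                          # 10,240 GB/s
dram_gbps = 80              # assumed dual-channel DDR5-class bandwidth

print(f"demand {total_demand_gbps:,.0f} GB/s vs DRAM {dram_gbps} GB/s "
      f"({total_demand_gbps / dram_gbps:.0f}x oversubscribed)")
```

Two orders of magnitude of oversubscription is why the caches, not the ALUs, set the ceiling for tensor workloads on CPUs.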

I speculate that the only reason this hasn't been done already is the marketability of single-thread speeds. Present thread speeds are insane, well into the RF realm of black-magic wizardry. I don't think it's possible to make these buses wider and maintain those thread speeds; it has too many LCR consequences. At around 5 GHz, the idea of wires as connections and gaps as insulators breaks down, since capacitive coupling can make a connection across any small gap.

Personally, I think this is a problem that will take a whole new architectural solution. It is anyone's game, unlike any time since the late 1970s. It will likely mark the beginning of the real RISC-V age and the death of x86. We are presently in the age of the 20+ thread CPU. If a redesign can produce a 50-500 logical-core CPU that is slower single-threaded but capable of all workloads, I think it will dominate easily. Choosing the appropriate CPU model will become much more relevant.
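The trade being described, many slower cores versus a few fast ones, is the classic Amdahl's-law question. A quick sketch with illustrative fractions (not benchmarks):

```python
# Amdahl's law: speedup of a workload with parallel fraction p on n cores,
# where `clock` is each core's speed relative to a fast single core.
def speedup(p: float, n: int, clock: float = 1.0) -> float:
    return clock / ((1 - p) + p / n)

# A hypothetical 500-core CPU at 60% clock vs. a 20-thread CPU at full clock:
# the wide chip wins only when the workload is overwhelmingly parallel.
wide = speedup(p=0.99, n=500, clock=0.6)   # ~50x
fast = speedup(p=0.99, n=20, clock=1.0)    # ~17x

print(f"wide={wide:.1f}x fast={fast:.1f}x")
```

For mostly serial workloads the equation flips, which is why "choosing the appropriate CPU model" would indeed matter far more in that world.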

[–] [email protected] 3 points 2 months ago

I do not make compromises in ownership.

preach!

At the end of the day, though, proper change will only come once a critical mass aligns on this issue, along with a few others.

The political process is too captured for peasants to effect any change; we have more power voting with our money as customers, at least for now.

[–] sturlabragason 2 points 2 months ago