Ok then. I'll echo what others are saying about 16GB being sufficient. In engineering it's occasionally not enough, but I don't think that's the case for comp sci. I'd still leave the door open and get one where you can upgrade the RAM, though.
One thing to look out for is CPU performance. I find the laptop CPU market is a disaster right now, where you really don't know what you'll get; LTT has a recent video on the topic. For most courses it won't actually matter that much. Examples of the ones where it could make a difference are numerical linear algebra, machine learning (classical, not neural networks), and computer vision (again, classical). In some of these extra RAM might also help, but I'd prioritize a better CPU over the RAM. You can look at CPU benchmarks to get an idea of relative performance.
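To give a feel for why those particular courses stress the CPU, here's a minimal sketch (numpy assumed; the matrix size is an arbitrary example): most of the work is dense number crunching that's limited by CPU throughput long before it's limited by RAM.

```python
# Minimal sketch of a CPU-bound workload typical of numerical courses:
# a dense matrix multiply. The 2000x2000 size is an arbitrary example.
import time
import numpy as np

a = np.random.rand(2000, 2000)

start = time.perf_counter()
b = a @ a  # dense matmul: saturates the CPU while using well under 1GB of RAM
print(f"2000x2000 matmul took {time.perf_counter() - start:.2f}s")
```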
In terms of GPU... I don't think you'll get anything capable enough for training neural networks at this price point, which is the only thing you may need it for in comp sci. It'll help with light gaming, though I imagine integrated graphics is good enough for Minecraft these days (don't quote me on that).
Lastly, I'd still recommend finding something with decent Linux support even if you don't want to use it (yet); you may choose to install it down the line. My Dell XPS/Precision has pretty poor Linux support, with a buggy trackpad that has caused me problems in the past. Many comp sci students end up switching to Linux or dual-booting, and for good reason.
I expect to be able to code on it
What kind of student? Computer science? Engineering?
I agree, very disappointing. I find it increasingly difficult since Twitter closed its doors to those not signed in, as my account is long deleted.
Nice news. The money only starts flowing in 2 years though, and then it takes many, many years for anything to get built on top of that. Wish we could pick up the pace on these issues.
Great read. Even as a grad student in STEM research, I'm very tired of everyone saying "let's try machine learning on this problem" to get something that works marginally better than some conventional model while requiring huge amounts of computation and data.
Paywalled and not on archive :/
Sorry for the super oversharpened photo. It's at like 30x zoom.
Nonetheless, it's gotten quite expensive. It made a nice post-bar snack, but there are now many better options at that price point.
Some people think they're good for brain development. I've met parents who gave their young kids a few pills every day hoping it would help make them smarter.
Hope to see it on GOG. But maybe not, since there's multiplayer.
But AMD would rather sell two cards at 1000 each than take the bet of trying to sell four at 750.
At the same time, though, this might not be unreasonable. I don't know what the profit margin on these cards is given R&D, manufacturing costs, and various other overheads, but it might be WAY more worth it to sell two at 1000 than four at 750. It might even be worth selling one at 1000 vs four at 750, depending on how slim the margin is after all those costs.
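For a rough sketch of the break-even point (the unit costs here are made-up assumptions; nobody outside AMD knows the real numbers): profit is just units * (price - unit cost), and two at 1000 beats four at 750 as soon as the per-card cost crosses 500.

```python
# Rough break-even sketch: profit = units * (price - unit_cost).
# The unit costs below are made-up assumptions for illustration.

def profit(units: int, price: float, unit_cost: float) -> float:
    return units * (price - unit_cost)

for unit_cost in (400, 500, 600):
    two_high = profit(2, 1000, unit_cost)  # two cards at $1000
    four_low = profit(4, 750, unit_cost)   # four cards at $750
    print(f"unit cost {unit_cost}: 2x$1000 -> {two_high}, 4x$750 -> {four_low}")

# 2*(1000 - c) > 4*(750 - c) simplifies to c > 500, so above ~$500 per card
# the two-at-$1000 strategy nets more total profit.
```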