Interesting concept, but the hard part is productization. I think one would be better off with Framework for that.
[email protected] (self-promotion, I am a mod there) covers semiconductor-related topics pretty well.
That being said, some articles do include business and tech news.
I thought you meant overall version complexity. Yes, you can't tell the difference between 1.4 and 2.0. But this was easier to manage than the different USB versions.
HDMI was somewhat straightforward until 2.1 or so.
Lack of an MMU is a big drawback for many use cases, no? That would make this more of a microcontroller-type device, from my (limited) understanding.
This was to be expected.
Intel said spinning out RealSense is not a direct result of the company’s recent financial struggles.
I don't buy this. If Intel were in the same position they were in, say, 2014, they probably wouldn't have spun off RealSense or Mobileye.
Shipments edged up 1.8 percent to 68.9 million units in calendar Q4, with Lenovo, up 4.8 percent, outpacing the average to consolidate market share and account for almost one in every four computers bought by distributors globally.
Government subsidies in China helped to boost consumer sales locally, and end-of-year promotions tempted shoppers in Europe and the US too. Businesses continued to gradually sign off on hardware upgrades before Windows 10 support ends in October 2025, albeit not at the rate vendors expected.
This was a better finish to the year for the PC industry, which grew by just 1 percent over the 12 months to 262.7 million units.
Considering the massive hype around AI generally (which began before 2024) and AI PCs specifically, this is a rather modest level of YoY shipment growth. And the key drivers identified by IDC were rather run of the mill: government subsidies in China and end-of-year holiday spending in Europe and the US.
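For a rough sense of scale, here's the back-of-the-envelope arithmetic on the quoted IDC figures (just a sketch; the implied 2023 numbers are derived from the quoted percentages, not from IDC's actual published baselines):

```python
# Rough check of the IDC figures quoted above (illustrative only;
# rounding means IDC's own base numbers may differ slightly).

q4_2024_units = 68.9    # million units shipped, calendar Q4 2024
q4_growth = 0.018       # +1.8% year over year
fy_2024_units = 262.7   # million units shipped, full year 2024
fy_growth = 0.01        # +1% year over year

implied_q4_2023 = q4_2024_units / (1 + q4_growth)
implied_fy_2023 = fy_2024_units / (1 + fy_growth)

print(f"Implied Q4 2023 shipments: ~{implied_q4_2023:.1f}M")               # ~67.7M
print(f"Implied FY 2023 shipments: ~{implied_fy_2023:.1f}M")               # ~260.1M
print(f"Full-year increase: ~{fy_2024_units - implied_fy_2023:.1f}M units") # ~2.6M
```

So all the AI PC hype translated into roughly 2-3 million extra units for the whole year, if the quoted percentages hold.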
Also fascinating to see Windows 10 support ending in 2025 not having a bigger impact. I have a feeling governments will force Microsoft to extend public (free) security updates for at least another year, if not more. I am definitely not switching to Windows 11.
Honestly, I think that while they had a financial incentive, they probably also just didn't want to bother with all the work required to add production-level support for PS3 games and the maintenance complexity of supporting it.
It's one thing to develop a strong proof of concept; there is a lot more involved in getting it to market as a commercial product (with guarantees and support).
That being said, excellent work by the folks who figured this out.
You can install modern AMD GPUs.
https://www.jeffgeerling.com/blog/2024/amd-radeon-pro-w7700-running-on-raspberry-pi
That said, the CPU can only really handle 720p for somewhat modern, graphically intensive titles.
Raspberry Pi is IMO an excellent tool for learning Linux (particularly the CLI; you honestly don't need a UI on the Pi itself) in a low-risk environment, while also allowing you to build some really useful services (NAS, Pi-Hole, media server).
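As an example of how small that first "useful service" can be, here's a toy sketch of sharing a folder over HTTP using nothing but the Python standard library (the path is hypothetical; a real NAS or media server setup like Samba or Jellyfin is obviously far more capable):

```python
# Minimal sketch: share a folder from a Raspberry Pi over HTTP on port 8000.
# /home/pi/media is a made-up path; point it at whatever directory you like.
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

handler = partial(SimpleHTTPRequestHandler, directory="/home/pi/media")
server = HTTPServer(("0.0.0.0", 8000), handler)
print("Serving /home/pi/media on port 8000 ...")
server.serve_forever()
```

Run it on the Pi, then browse to http://<pi-ip>:8000 from any machine on the LAN.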
I've been reading (and subscribing to) Ars Technica for a long time (20+ years reading, ~10 year sub).
While they have pretty solid coverage on many topics (science, US public policy, general tech), their coverage of Apple has always been very biased. The Apple fanboys in the comments are also extremely annoying and pathetic.
EDIT: Added the paragraphs in question:
The BBC stories about the error-prone AI have often seemed to lack understanding of how the Apple Intelligence notification summaries work—for example, in suggesting that all users received the offending notification about Mangione. The wording of the summaries varies on individual devices depending on what other notifications were received around the same time.
Nevertheless, it's a serious problem when the summaries misrepresent news headlines, and edge cases where this occurs are unfortunately inevitable. Apple cannot simply fix these summaries with a software update. The only answers are either to help users understand the drawbacks of the technology so they can make better-informed judgments or to remove or disable the feature completely. Apple is apparently going for the former.
We're oversimplifying a bit here, but generally, LLMs like those used for Apple's notification summaries work by predicting portions of words based on what came before and are not capable of truly understanding the content they're summarizing.
Further, these predictions are known to not be accurate all the time, with incorrect results occurring a few times per 100 or 1,000 outputs. Deploying this technology at scale without users really understanding how it works is risky at best, whether it's with the iPhone's summaries of news headlines in notifications or Google's AI summaries at the top of search engine results pages. Even if the vast majority of summaries are perfectly accurate, there will always be some users who see inaccurate information.
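To make the "predicting portions of words" point concrete, here's a toy next-token sketch (a trivial bigram word model, nothing like the subword neural models Apple or Google actually ship, but the core loop is the same: pick the next token from what came before, with no notion of whether the result is true):

```python
# Toy illustration of next-token prediction: a bigram model that picks the
# most frequent follower of the previous word. Real LLMs use learned neural
# networks over subword tokens, but they likewise have no model of "truth",
# only of what token is likely to come next.
from collections import Counter, defaultdict

corpus = "the suspect was arrested . the suspect denied the charge .".split()

followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def generate(start, length=8):
    out = [start]
    for _ in range(length):
        candidates = followers[out[-1]]
        if not candidates:
            break
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(generate("the"))  # fluent-looking output, no understanding behind it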
It's no wonder that we are stuck in an era of corrupt oligarchic regimes, weakened democracy, rising authoritarianism and an inability to solve our pressing problems.
I will add that this is not doomerism on my part. History always goes in cycles; the current regime will eventually reach a point where it won't be viable due to the weight of its own contradictions. But that doesn't mean we have yet reached the trough part of the cycle.
We are in for some interesting times.
Isn't the daringfireball blog a de facto front for Apple PR?