which is central to the company’s data-as-codewords strategy of shrinking the size of data being transferred by an average of 75 percent in near real-time, resulting in an average four-times increase in effective bandwidth.
Hardware
All things related to technology hardware, with a focus on computing hardware.
Rules:
- Follow the Lemmy.world Rules - https://mastodon.world/about
- Be kind. No bullying, harassment, racism, sexism, etc. against other users.
- No spam, illegal content, or NSFW content.
- Please stay on topic. Adjacent topics (e.g. software) are fine if they are strongly relevant to technology hardware; business news about hardware-focused companies is another example.
- Please try to post original sources when possible (as opposed to summaries).
- If posting an archived version of an article, please include a URL link to the original article in the body of the post.
Some other hardware communities across Lemmy:
- Augmented Reality - [email protected]
- Gaming Laptops - [email protected]
- Laptops - [email protected]
- Linux Hardware - [email protected]
- Microcontrollers - [email protected]
- Monitors - [email protected]
- Raspberry Pi - [email protected]
- Retro Computing - [email protected]
- Single Board Computers - [email protected]
- Virtual Reality - [email protected]
Icon by "icon lauk" under CC BY 3.0
Sounds like middle-out compression to me
Sounds like a novel and genuinely useful approach to reducing data transfer sizes.
From my understanding, this is an ML-powered, on-the-fly compression scheme that optimizes for your particular workload type.
Even conceptually, this makes a lot of sense.
I assume it would only work for certain use cases, though. E.g. I can't imagine it would do much for streaming video, which is already compressed (e.g. AV1).
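Nobody outside the company knows what it actually does, but the closest well-known analogue to "a codec tuned to your particular traffic" is probably a trained compression dictionary. Here is a minimal sketch using zstd's trainable dictionaries via the `zstandard` package; the JSON samples, sizes, and field names are made up for illustration, and this is emphatically not the vendor's method:

```python
# Sketch only: workload-tuned compression via a trained zstd dictionary,
# as an existing analogue of "optimize the codec for your traffic".
# Requires: pip install zstandard
import zstandard as zstd

# Pretend these are samples of one workload's typical payloads
# (hypothetical JSON telemetry records sharing a common structure).
samples = [
    b'{"sensor": "temp-%03d", "value": %d.%d, "unit": "C"}'
    % (i, 15 + i % 10, i % 10)
    for i in range(1000)
]

# Train a small dictionary on the workload, then compress with it.
dictionary = zstd.train_dictionary(1024, samples)
compressor = zstd.ZstdCompressor(dict_data=dictionary)
decompressor = zstd.ZstdDecompressor(dict_data=dictionary)

payload = b'{"sensor": "temp-042", "value": 19.5, "unit": "C"}'
compressed = compressor.compress(payload)
assert decompressor.decompress(compressed) == payload
print(f"{len(payload)} bytes -> {len(compressed)} bytes")
```

The point being: a codec primed on one workload's structure can beat a generic codec on that workload, which is presumably the pitch here, with or without ML.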
That is why I don’t believe it. There are certainly workloads where you can save 75% or more of the bandwidth by compressing on the fly. But most of the data going over a typical cloud-services pipe is images, streaming video, and gzipped HTML/JSON, i.e. already compressed. Claiming you can save 75% on that, while insisting it's not compression, and explaining why it's not compression in pure word salad, tells me they are lying.
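The "already compressed" objection is easy to demonstrate with nothing but the standard library. A quick sketch: gzip a repetitive text payload, then gzip the result again, and compare the savings.

```python
# Sketch: re-compressing already-compressed data gains almost nothing,
# which is the objection to a blanket "75% savings" claim.
import gzip

text = b"the quick brown fox jumps over the lazy dog " * 1000
once = gzip.compress(text)    # plain text compresses well
twice = gzip.compress(once)   # gzipped data does not (it may even grow)

print(f"text:       {len(text)} bytes")
print(f"gzipped:    {len(once)} bytes "
      f"({100 * (1 - len(once) / len(text)):.0f}% saved)")
print(f"re-gzipped: {len(twice)} bytes "
      f"({100 * (1 - len(twice) / len(once)):.0f}% saved)")
```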
Sounds like Huffman encoding (zip) with file indexing and streaming. Woo.
But you definitely could do data mining and other things on compressed files.
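For intuition, the Huffman guess above maps to something like this minimal sketch. Real zip/DEFLATE pairs Huffman coding with LZ77 match-finding; this is just the Huffman half, written from scratch for illustration:

```python
# Minimal Huffman coding sketch: frequent bytes get shorter codes.
import heapq
from collections import Counter

def huffman_codes(data: bytes) -> dict[int, str]:
    # One heap entry per distinct byte: (frequency, tiebreaker, leaf).
    heap = [(freq, i, (byte,))
            for i, (byte, freq) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    # Repeatedly merge the two lightest subtrees (classic Huffman).
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tiebreak, (left, right)))
        tiebreak += 1
    # Walk the tree: left edge appends "0", right edge appends "1".
    codes: dict[int, str] = {}
    def walk(node, prefix=""):
        if len(node) == 1:                  # leaf: (byte,)
            codes[node[0]] = prefix or "0"  # single-symbol edge case
        else:                               # internal: (left, right)
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
    walk(heap[0][2])
    return codes

data = b"abracadabra"
codes = huffman_codes(data)
encoded = "".join(codes[b] for b in data)
print(codes)
print(f"{len(data) * 8} bits -> {len(encoded)} bits")
```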