chrash0

joined 9 months ago
[–] chrash0 2 points 2 weeks ago* (last edited 2 weeks ago)

it’s a re-imagining of make specialized for the thing most people use make for: just running commands. simpler, more modern syntax and no aspirations of being a full build system.

[–] chrash0 2 points 3 weeks ago

this is just combining existing data scraping tools with LLMs to create a pretty flimsy and superfluous product. they use the data to do what they say. if they wanted to scrape data on you they could already do that. all they get from this is your interest and maybe some other PII like your email address. the LLM is just incidental here. it’s honestly not even as bad privacy-wise as a “hot or not” or personality quiz.

[–] chrash0 12 points 1 month ago

the reactionary opinions are almost hilarious. they’re like “ha this AI is so dumb it can’t even do complex systems analysis! what a waste of time” when 5 years ago text generation was laughably unusable and AI generated images were all dog noses and birds.

[–] chrash0 20 points 1 month ago

you have to do a lot of squinting to accept this take.

so his wins were copying competitors, and even those products didn’t see success until they were completely revolutionized (Bing in 2024 is a Ballmer success? .NET becoming widespread is his doing?). one thing Nadella did was embrace the competitive landscape and open source, with key acquisitions like GitHub and open sourcing .NET, and i honestly don’t have the time to fully rebut this hot take. but i don’t think the Ballmer haters are totally off base here. even if some of the products started under Ballmer are now successful, it feels disingenuous to attribute their success to him. it’s like an alcoholic dad taking credit for his kid becoming an actor. Microsoft is successful despite him.

[–] chrash0 24 points 1 month ago (4 children)

these days Hyprland but previously i3.

i basically live in the terminal unless i'm playing games or in the browser. these days i use most apps full screen and switch between desktops, and i launch apps using wofi/rofi. this has all become very specialized over the past decade, and it almost has a “security by obscurity” effect where it’s not obvious how to do anything on my machines unless you have my muscle memory.

not that i necessarily recommend this approach generally, but i find value in mostly using a keyboard to control my machines and minimizing visual clutter. i don’t even have desktop icons or a wallpaper.

[–] chrash0 98 points 3 months ago (3 children)

this is one of those facts i have to struggle to keep to myself to avoid coming off as an insufferable nerd

[–] chrash0 12 points 3 months ago (2 children)

All programs were developed in Python language (3.7.6). In addition, freely available Python libraries of NumPy (1.18.1) and Pandas (1.0.1) were used to manipulate data, cv2 (4.4.0) and matplotlib (3.1.3) were used to visualize, and scikit-learn (0.24.2) was used to implement RF. SqueezeNet and Grad-CAM were realized using the neural network library PyTorch (1.7.0). The DL network was trained and tested using a DL server mounted with an NVIDIA GeForce RTX 3090 GPU, 24 Intel Xeon CPUs, and 24 GB main memory

it’s interesting that they’re using pretty modest hardware (i assume they mean 24 cores, not 24 CPUs) and fairly outdated dependencies. also, having their dependencies listed out like this is pretty adorable. it has academic-out-of-touch-not-a-software-dev vibes. makes you wonder how much further a project like this could go with decent technical support. like, all these talented engineers are using 10k times the power to work on generalist models like GPT that struggle at these kinds of tasks, while promising they’ll handle them someday and trivializing them as “downstream tasks”. i think there’s definitely still room in machine learning for expert models; it sucks that they struggle to get proper support.
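for reference, the SqueezeNet + Grad-CAM piece of that stack is only a handful of lines. this is just a rough sketch of the usual hook-based pattern on a current torch/torchvision (the paper pins PyTorch 1.7.0), not their actual code, and the random tensor stands in for a real preprocessed image:

```python
import torch
from torchvision.models import squeezenet1_1

# untrained here so the sketch runs offline; weights="DEFAULT" pulls the pretrained model
model = squeezenet1_1(weights=None).eval()
target_layer = model.features[-1]  # last Fire block before the classifier

activations, gradients = {}, {}

def save_activation(module, inputs, output):
    activations["a"] = output
    # grab the gradient flowing back through this feature map
    output.register_hook(lambda grad: gradients.update(g=grad))

target_layer.register_forward_hook(save_activation)

x = torch.randn(1, 3, 224, 224)        # stand-in for a preprocessed image
scores = model(x)
scores[0, scores.argmax()].backward()  # backprop from the top-scoring class

# Grad-CAM: weight each feature map by its average gradient, then ReLU and normalize
w = gradients["g"].mean(dim=(2, 3), keepdim=True)
cam = torch.relu((w * activations["a"].detach()).sum(dim=1)).squeeze()
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
print(cam.shape)  # coarse 13x13 heatmap you’d upsample over the input image
```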

[–] chrash0 3 points 3 months ago (1 children)

i haven’t personally had trouble with that since early 2023, but it depends on your dependencies

[–] chrash0 15 points 3 months ago (5 children)

i feel like if you’re not sat stationary at a workstation (who is these days?) what you want is a laptop that’s good at being a laptop. 99% of the software developers i work with (not a small number) use MacBook Pros. they are well built, have good components, have best-in-class battery life (we’ll see how things shake out with Qualcomm), and are BSD-based and therefore Unix-compatible. my servers and gaming/CUDA PC? Linux all day. my laptop? MacBook. i’m not ideological enough to have range anxiety every time i step away from my desk. plus any decent-sized org is going to have to administer these machines, for everyone from scientists to administrators, and catering to .4% of your users is not a good ROI if your software vendors struggled for 8 years to get their Windows 98-based specialty sensor software to run on Mac.

that .4% is likely not 0 because they are nerds.

seriously tho if Qualcomm chips can make a Linux book that lasts all day i would happily make the switch

[–] chrash0 7 points 4 months ago* (last edited 4 months ago) (11 children)

i’m not really here to advocate for Rust in the kernel. just to get the bias out of the way: i work on Rust professionally at a Fortune 100 company that is in the process of adopting it, which may skew my perception of it as mainstream.

it is part of the project though, no? drivers still need to be interfaced with, so the people working on driver interfaces should be comfortable with it, again at least enough to keep basic builds working and do basic code review. this is specifically in reference to the issue this thread ostensibly started from: a kernel dev getting worked up about “having to learn Rust”. so no, i don’t think it’s a strawman to point out that there are real people denying or frustrating patches just because they don’t understand the language. overly harsh maybe, but not a total mischaracterization.
