Samsung is reportedly preparing to wind down its SATA SSD business, and a notable hardware leaker warns the move could have broader implications for consumer storage pricing than Micron’s decision to end its Crucial RAM lineup. The report suggests reduced supply and short-term price pressure may follow as the market adjusts.
That’s the entire point. It’s a scam.
Compared to crypto and NFTs, there is at least something in this mix, not that I could identify it.
I’ve become increasingly comfortable with LLM usage, to the point that last year’s me would hate me. Compared to projects where I used to be deep in Google, Reddit, and Wikipedia, ChatGPT gives me pretty good answers much more quickly, and far more tailored to my needs.
I’m getting into home labs, and currently everything I have runs on ass-old laptops and phones, but I do daydream of the day when I can run an ethically and sustainably trained LLM myself that compares to the current GPT-5, because as much as I hate to say it, it’s really useful to my life to have a sometimes-incorrect but overall knowledgeable voice that’s perpetually ready to support me.
The irony is that I’ll never build a server that can run a local LLM due to the price hikes caused by the technology in the first place.
Please hate yourself, reflect on that, and walk back from contributing to destroying the environment by furthering widespread adoption of this shitty technology. The only reason you seem to get “useful answers” is because of search engine and website enshittification. What you are getting is still tons worse than good web research was 10 years ago.
Basically you were taught to enjoy rancid butter because all restaurants around you had started tasting like shit first, then someone opened a rancid butter shop.
I do agree entirely. If I could use the internet of 2015 I would, but I can’t do so in a practical way that isn’t much more tedious than asking an LLM.
My options are the least rancid butter of the rancid butter restaurants or I churn my own. I’d love to churn my own and daydream of it, but I am busy, and can barely manage to die on every other hill I’ve chosen.
Web search isn’t magically going back to how it was, and it’s not just search engines; it’s every mf trying to take advantage of SEO and push their content to the top. Search is going to get worse every year. AI did speed it up by making a bunch of AI images pop up whenever you search for an image.
Problem is that the widespread use of LLMs (and thereby the provision of your data to them) contributes to the rise of totalitarian regimes, wage slavery, and the destruction of our planet’s ecosystem. Not a single problem in any of our lives is important enough to justify this. And convenience, because we are too lazy to think for ourselves or to do some longer (more effort) web research, is definitely not a good excuse to be complicit in murder, torture, and ecoterrorism.
I agree, except for the fact that it’s unavoidable.
It’s horrific, but it’s inescapable. The problem is not going away, and while you’re refusing to use LLMs to accelerate your progress, the opposition isn’t.
Don’t get me wrong, anyone who blindly believes sycophantic LLM garbage is a fool.
It’s taken 4 years to overcome my LLM moral OCD, and it’s only because I need to start working. In a world where every company forces AI down your throat, there are many who simply have no choice if they want to compete.
Also, I’m kinda glad I can spend more of my useful energy working towards my goals rather than battling the exact minutiae without any sort of guide.
The thing is: LLMs do not accelerate the progress of proper software development. Your processes have to be truly broken to be able to experience a net gain from using LLMs. It enables shitty coders to output pull requests that look like they were written by someone competent, and thereby effectively waste the time of skilled developers who review such pull requests out of respect for the contributor, only to find out it is utter garbage.
I’m sorry but this is simply incorrect.
What you are saying is true of pure vibe coding, but you get an insane net gain from using LLMs for sanity checks and as a general ‘bootstrap’ for a project.
Yes it will make mistakes, but if you can actually program well so as to understand when it is incorrect, it is incredibly helpful as a tool.
If you get a net gain, you are an untalented hack and shouldn’t be let anywhere near SW development.
It’s the difference between a pyramid scheme and an MLM: one of them has a product in the mix.