Wow you just shined a ton of light on a problem my company had. We wanted to implement a medical imaging system from one of their subsidiaries, and it took an average of 3 months for the salesperson to respond to EACH of our emails
The government has already stepped in several times. If you're in the mood to get mad, read up on the results of those interventions. Basically, Boeing was almost forced to deal with actual oversight, but was able to convince the government at the last minute that they could handle the oversight themselves internally (thanks, of course, to the wonderful process of lobbying)
Ok but before you go, just want to make sure you know that this statement of yours is incorrect:
In the strictest technical terms, AI, ML and Deep Learning are distinct, and they have specific applications
Actually, they are not the distinct, mutually exclusive fields you claim they are. ML is a subset of AI, and Deep Learning is a subset of ML. AI is a very broad term for programs that emulate human perception and learning. As you can see in the last intro paragraph of the AI Wikipedia page (whoa, another source! aren't these cool?), some examples of AI tools are listed:
including search and mathematical optimization, formal logic, artificial neural networks, and methods based on statistics, operations research, and economics
Some of these - mathematical optimization, formal logic, statistics, and artificial neural networks - comprise the field known as machine learning. If you'll remember from my earlier citation about artificial neural networks, "deep learning" is when artificial neural networks have more than one hidden layer. Thus, DL is a subset of ML, which is a subset of AI (wow, sources are even cooler when there are multiple of them that you can logically chain together! knowledge is fun).
Anyways, good day :)
When you want to cite sources like me instead of making personal attacks, I’ll be here 🙂
https://en.m.wikipedia.org/wiki/Large_language_model
LLMs are artificial neural networks
https://en.m.wikipedia.org/wiki/Neural_network_(machine_learning)
A network is typically called a deep neural network if it has at least 2 hidden layers
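To make that concrete, here's a minimal sketch (assuming PyTorch; the layer sizes are made up purely for illustration) of a network that clears the "at least 2 hidden layers" bar:

```python
import torch.nn as nn

# Two hidden layers => "deep" by the definition quoted above.
# The 784/128/64/10 sizes are arbitrary, not from any source.
deep_net = nn.Sequential(
    nn.Linear(784, 128),  # hidden layer 1
    nn.ReLU(),
    nn.Linear(128, 64),   # hidden layer 2
    nn.ReLU(),
    nn.Linear(64, 10),    # output layer
)
```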
Sorry, it's just that I work in a field where distinctions are based on math and/or logic, while you're distinguishing AI-based from non-AI-based image interpolation based on opinion and subjective observation
Interesting example, because tickets issued by automated cameras aren’t enforced in most places in the US. You can safely ignore those tickets and the police won’t do anything about it because they know how faulty these systems are and most of the cameras are owned by private companies anyway.
“Readable” is a subjective matter of interpretation, so again, I’m confused on how exactly you’re distinguishing good & pure fictional pixels from bad & evil fictional pixels
Normie, layman… as you've pointed out, it's difficult to use these words without sounding condescending (which I didn't mean to be). The media's use of words like "hallucinate" to describe linear algebra is necessary because most people just don't know enough math to understand the fundamentals of deep learning - which is completely fine, people can't know everything and everyone has their own specialties. But any time you simplify science to make it digestible for the masses, you lose critical information in the process, which can sometimes be harmfully misleading.
Both insert pixels that didn’t exist before, so where do we draw the line of how much of that is acceptable?
Everyone uses the word "hallucinate" when describing visual AI because it's normie-friendly and cool-sounding, but the results are a product of math. Very complex math, yes, but computers aren't taking drugs and randomly pooping out images, because computers can't do anything truly random: their "randomness" comes from deterministic pseudorandom number generators.
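Don't take my word for it - here's a trivial Python demo (standard library only): seed the generator and the "random" output is bit-for-bit identical on every run.

```python
import random

# Pseudorandom means deterministic: same seed, same "random" numbers.
random.seed(42)
print([random.random() for _ in range(3)])

random.seed(42)
print([random.random() for _ in range(3)])  # prints the exact same list
```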
You know what else uses math? Basically every image modification algorithm, including resizing. I wonder how this judge would feel about viewing a 720p video on a 4k courtroom TV because “hallucination” takes place in that case too.
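If you're curious what that "hallucination" looks like under the hood, here's a rough plain-Python sketch of bilinear interpolation (the toy 2x2 image and its values are made up for illustration): every upscaled pixel is just a weighted average of its real neighbors, i.e. a value the camera never captured.

```python
def bilinear_sample(img, x, y):
    """Sample a 2D grayscale image at fractional coordinates (x, y)."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(img[0]) - 1)
    y1 = min(y0 + 1, len(img) - 1)
    fx, fy = x - x0, y - y0
    # Weighted average of the 4 surrounding real pixels.
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bottom = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

img = [[0, 255],
       [255, 0]]
# 127.5: a pixel value that exists nowhere in the original image.
print(bilinear_sample(img, 0.5, 0.5))
```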
Happily playing modern games and developing shaders on my AMD GPU. 5120x1440 @ 120 Hz, issue-free
I wish people would get their shit together and realize they’ve fallen victim to marketing
If it’s to “support game developers”, I’m guessing/hoping that means the ads will only show up in the Activities feature
That last picture is hilarious. How can that possibly be admissible evidence? There's nothing distinguishing it from a screencap of a legit stream
replacing them with individual personalized echo chambers
Which really wouldn't be that bad if instances were more clear about how they operate. Like, on the user signup page, there should be a big ol' checkbox saying "I UNDERSTAND THAT ANYTHING THAT'S NOT A POSITIVE POST ABOUT COMMUNISM WILL GET ME BANNED" or whatever
Idk if I would advocate for or defend it, but I find mobile ads especially abhorrent cuz they take up more relative space on the screen, and my upload speed isn't good enough to be VPNing through my Pi-hole anytime I'm outside the house
iOS browsers are just skins for Safari anyways, and Brave addresses my issue out of the box, so yeah
Because it blocks ads out of the box. I know its new tab screen causes a lot of y’all’s buttholes to clench because it mentions cryptocurrency, but there are harder things to ignore
Swap partitions disliked this comment
Vice articles are one of two things:
Is there really a shortage of these on the internet?
Pre-smartphones, my parents were always yelling at my brothers and me to stop texting at the dinner table.
Post-smartphones, it's now vice versa, and unless we remind them to put their phones on silent beforehand, their phones inevitably erupt in alarms at full volume reminding them to call Marianne back or whatever
Considering how much Google has entrenched itself in the Internet (see the Manifest V3 fiasco), I would argue that creating a new browser is effectively a fork of the web