Before I understood Docker, I used to have HA installed directly on bare metal side by side with other “desktop” apps.
To access devices, HA needs many different OS-level configurations (users, startup, binding serial ports, and much more I don't have a clue about). It was a giant mess: the bare OS was polluted with HA configuration, and on every HA update, not only did those configurations change, the installation itself changed enough that updates would break HA, and configuration conflicts would even break parts of the bare OS.
Could this be managed properly through long-term migration? Yeah, probably, but that would be a ton of work, and a purpose-built solution already exists: Docker. Between that and the extra layer of security afforded by dedicating an OS to HA (bare metal or virtualized), discouraging the installation of HA in a non-dedicated environment was a no-brainer.
This is what happens when stack overflow is used for training.
I think this meme would be 450% better with parallax
Generative AI is not smart to begin with. LLMs are basically just compressed versions of the internet that statistically predict what a sentence needs to be to look “right”. There’s a big difference between appearing right and being right. Without a critical approach to information, independent reasoning, and individual sensing, these AIs are incapable of any meaningful intelligence.
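To make the “statistically predict what looks right” point concrete, here’s a toy sketch of my own (a bigram model, vastly simpler than a real LLM but the same idea in miniature): it picks each next word purely from observed frequencies, so it produces plausible-looking text with no notion of whether anything is true.

```python
import random
from collections import defaultdict, Counter

# Toy "language model": count which word tends to follow which,
# then generate text by sampling a statistically likely next word.
corpus = ("the cat sat on the mat . the dog sat on the rug . "
          "the cat chased the dog .").split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def generate(start, length=8, seed=0):
    rng = random.Random(seed)
    word, out = start, [start]
    for _ in range(length):
        candidates = following[word]
        if not candidates:
            break
        # Sample proportionally to observed frequency -- this is all
        # the model "knows"; there is no concept of true or false.
        words, counts = zip(*candidates.items())
        word = rng.choices(words, weights=counts)[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

Every word it emits is a word that plausibly followed the previous one in its training data, which is exactly why the output “looks right” without being grounded in anything.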
In my experience, the emperor and most people around them have not figured this out yet.
I think this is satire. Poe’s law is stronger than ever
You use LLMs for everything? Seems strange, as they don’t reason. They are specifically designed to mimic human speech. So they are great for tasks that require presenting information that looks intelligible, or that are at least very easily testable, but beyond that you run into serious issues with hallucination fast…
Or do you mean “AI” as in data science and automation? That’s a very different thing, and a bit off topic. That kind of “AI” is neither new, nor does it have the hallucination/ecological/cost/training-effort issues associated with LLMs
I dunno dude, all your answers talk about “AI” in suspiciously vague terms. “I use AI to …” is the new “built with blockchain”. Skip the marketing terms and talk shop.
Sounds like neither of you watched the video. Fortunately, I did, so here’s a quick summary. The thesis is that music is getting worse, for two main reasons: production has been industrialized, replacing human expression with machines and processes, and streaming has made music so cheap and abundant that listeners invest less in any one piece.
The first point has been touched on by many other people, and it’s a common trend in a lot of places outside of music too. People are replaced with machines and processes in many settings, especially in corporations and commerce, and while that’s great for efficiency and predictability, it creates a sterile landscape devoid of human expression. This is not to say all music suffers from this, but mass-market music is a chief culprit.
The other point really resonates with me through videogames and videogame sales. You can get a dozen great Steam games for the price of a single Nintendo title, yet I probably put 10x the time into that one Nintendo title as into all the other Steam games combined: I had to get every bit of value out of that expensive Nintendo purchase. YMMV on this point, though. I don’t stream music, so I can’t say how it has affected me personally.
Terminals are powerful and flexible, but still slower than a dedicated UI to see states at a glance, issue routine commands, or do text editing.
Terminal absolutists are as insufferable as GUI purists. There is a place and time for both.
The problem is the hysteria behind it, leading people to confuse good-sounding information with good information. At least when people produce information, they generally make an effort to get it right. Machine learning is just an uncaring bullshitting machine that is rewarded for its ability to fool people (turns out the Turing test was a crappy benchmark for practice-ready AI, beyond writing poems). And VC money hasn’t reached the “find out” phase of that looming lesson: the point where we all just get collectively exhausted by how underwhelming the AI fad is.
They are neither ham nor steamed, unless that’s exactly what they are having over there? Wouldn’t compare it to a hamburger though… (A hamburger being a Hamburg steak, as in the German city)
Epic sax guy and ska is a very strange juxtaposition.
Can’t tell if you are joking. I know a lot of junior developers who think this is a legitimate solution.
Bro, that’s called enlightenment.
Is this the before or the after picture?
The wording of the article implies an apples-to-apples comparison: 1 Google search == 1 question successfully answered by an LLM. Remember, a Google search in lay terms is not the act of clicking the search button; rather, it’s the act of going to Google to find a website that has the information you want. The equivalent with ChatGPT would be starting a “conversation” and getting the information you want on a particular topic.
How many search-engine queries or LLM prompts that involves, or how broad the topic is, is a level of technical detail that one assumes the source for the 25x figure has already controlled for. (Feel free to ask the author for the source and share it with us, though!)
Anyone who has even remotely used any kind of deep learning will know right away that it consumes an order of magnitude or two more power (and compute!) than algorithmic, rules-based software, so a figure like 25x for a similar effective outcome would not be surprising at all if the approach used is unnecessarily complex.
For example, I could write a neural network to compute 2+2, or I could use an arithmetic calculator. One requires a $500 GPU consuming 300 watts; the other is a $2 pocket calculator running on 5 watts, returning the answer before the neural network is even done booting.
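For the skeptical: here’s a minimal sketch of that contrast (my own toy example; the dollar and wattage figures above are the comment’s, not mine). A one-layer “network” is fitted to addition by gradient descent over thousands of steps, just to end up approximating what `2 + 2` gives you instantly and exactly.

```python
import random

def train_adder(steps=2000, lr=0.01, seed=0):
    """Fit y = w1*a + w2*b to random addition examples by gradient descent."""
    rng = random.Random(seed)
    w1 = w2 = 0.0
    for _ in range(steps):
        a, b = rng.uniform(0, 5), rng.uniform(0, 5)
        err = (w1 * a + w2 * b) - (a + b)   # prediction minus true sum
        w1 -= lr * err * a                   # gradient step on squared error
        w2 -= lr * err * b
    return w1, w2

w1, w2 = train_adder()                 # thousands of updates later...
learned = w1 * 2 + w2 * 2              # the "neural network" answer (approximate)
exact = 2 + 2                          # the "pocket calculator" answer (instant)
print(round(learned, 3), exact)
```

The learned weights converge toward (1, 1), so the model only ever approximates the sum it was trained on, while the direct computation is exact and essentially free. That is the core of the efficiency argument: matching a simple rule with a learned approximation costs orders of magnitude more work.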
Asking your employer for more compensation because you are exerting more effort due to inexperience isn’t so different from a AAA studio charging high fees for a crappy product because of corporate bullshit and inefficiency.
In fact, these two things tend to be two sides of the same coin.
Article summary: Japan’s system is not interchangeable with systems outside Japan, which is a friction point for exports.
I also have the same question. So I upvoted you. Also, where does this fall on the chart?
How about we ban software in cars in general, beyond basic engine control?