• 0 Posts
  • 9 Comments
Joined 1 year ago
Cake day: June 12th, 2023

  • S410@kbin.social to Lemmy Shitpost@lemmy.world · Please Stop · 4 months ago

    Well, why would banks replace a system that lets them charge fees for every interaction with their services? A blockchain solution would let multiple banks (and possibly even regular people) access the data with no middlemen and, therefore, no fees. Or, well, no fees that end up directly in the banks’ pockets as profit, that is.

    Getting rid of that is bad for business. So, unless something magical happens and the EU, for example, passes a law requiring banks to switch to a more decentralized, fairer system, it’s not going to happen.


  • You can lose access to a regular account just as easily as to a blockchain one. In fact, losing your password manager’s database is even worse, because even if you have backups, they’re not going to be complete.

    With a blockchain, all you have to worry about is your private key. You can write it down on a piece of paper, if you want, and put it away in a safe or a bank vault or something. Then, if you use it to restore your access years later, nothing will be lost (see the sketch after this comment).

    “There are 2 types of people in the world: those who make backups, and those who don’t make backups yet.”
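    A minimal Python sketch of the paper-backup idea. It is purely illustrative: real wallets derive keys from seed phrases via standards like BIP-39, not a raw SHA-256 hash, and every name here is made up.

    import hashlib

    def key_from_seed(seed_phrase: str) -> str:
        # The same paper backup always yields the same key, whether you
        # restore tomorrow or in ten years. (Toy derivation, not BIP-39.)
        return hashlib.sha256(seed_phrase.encode("utf-8")).hexdigest()

    backup = "correct horse battery staple"  # written down, locked in a safe

    key_then = key_from_seed(backup)
    key_now = key_from_seed(backup)   # years later, from the same paper
    assert key_then == key_now        # nothing is lost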



  • “AI” models are, essentially, solvers for mathematical systems that we humans cannot describe, and so cannot write solvers for, ourselves.

    For example, a calculator for pure numbers is a pretty simple device, all of whose logic can be designed by a human directly. A language, though? Or an image classifier? Those are not possible to create by hand.

    With “AI”, instead of designing all the logic manually, we create a system which can end up in a finite, yet still near-infinite, number of states, each of which defines behavior different from the others. By slowly tuning the model on existing data and checking its performance, we (ideally) end up with a solver for some incredibly complex system.

    If we were to try to make a regular calculator that way, and all we gave the model was “2+2=4”, it would memorize the equation without understanding it. That’s called “overfitting”, and it’s something people building AI try their best to prevent. It happens when the training data contains too many repeats of the same thing.

    However, if there is no repetition in the training set, the model is forced to actually learn the patterns in the data, instead of the data itself.

    Essentially: if you’re training a model on a single copyrighted work, you’re making a copy of that work via overfitting. If you’re using terabytes of diverse data, overfitting is minimized; instead, the resulting model ends up with an actual understanding of the system you’re training it on.
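    A toy numeric version of the “2+2=4” point above, assuming a linear model y = w1*a + w2*b trained by gradient descent (all names and numbers here are invented for illustration):

    import numpy as np

    # A toy "adder": y = w1*a + w2*b, trained by stochastic gradient descent.
    def train(pairs, init, steps=2000, lr=0.01):
        w = np.array(init, dtype=float)
        for _ in range(steps):
            for (a, b), y in pairs:
                x = np.array([a, b], dtype=float)
                w -= lr * (w @ x - y) * x  # gradient of squared error
        return w

    init = [2.0, 0.0]  # arbitrary non-symmetric starting point

    # Only "2+2=4", over and over: many weight settings satisfy it, and the
    # model settles on one that merely memorizes the equation.
    w_overfit = train([((2, 2), 4)] * 10, init)
    print(w_overfit @ [2, 3])  # ~4.0 -- fails on unseen inputs

    # Diverse examples force it to learn the actual pattern (w1 = w2 = 1).
    diverse = [((2, 2), 4), ((1, 3), 4), ((5, 1), 6), ((0, 4), 4), ((3, 7), 10)]
    w_general = train(diverse, init)
    print(w_general @ [2, 3])  # ~5.0 -- generalizes to addition

    Trained only on the repeated equation, the model reproduces “4” without representing addition; the diverse set forces it toward w1 = w2 = 1, the actual pattern.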




  • USB-C is an interface that can be used for a variety of different things: there are different “levels” of Power Delivery, there’s Thunderbolt, there’s DisplayPort over USB-C, etc. And for things to work, the devices on both ends of the cable, and the cable itself, must comply with any given standard.

    For example, on some laptops you can’t use the USB-C port that carries Thunderbolt to charge the device, nor the port that supports Power Delivery to connect Thunderbolt devices. While they share the same physical connector, the ports are not interchangeable. And even if you’re connecting everything right, nothing is going to work if the cable you’re using isn’t specced properly (and trying to figure out the spec of a cable you already have, considering they rarely carry any labeling, is definitely “fun”).

    If anything, USB-C makes everything harder and more convoluted, because instead of using different ports and plugs for different standards, it’s now one port for nigh everything under the sun. If you want things to work nowadays, you have to hunt down cable and port specs to make sure everything is mutually compatible.
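    The chore looks roughly like this hypothetical check, where a connection only works if the host port, the cable, and the device all support the capability you need (capability names invented for illustration):

    def works(host_port: set, cable: set, device: set, need: str) -> bool:
        # Every link in the chain must support the needed capability.
        return need in host_port & cable & device

    laptop_tb_port = {"usb3", "thunderbolt", "displayport"}
    laptop_pd_port = {"usb3", "power-delivery"}
    cheap_cable    = {"usb2", "power-delivery"}   # unlabeled, of course
    monitor        = {"displayport", "power-delivery"}

    print(works(laptop_tb_port, cheap_cable, monitor, "displayport"))     # False: the cable can't
    print(works(laptop_pd_port, cheap_cable, monitor, "power-delivery"))  # True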


  • USB-C makes things kinda worse, in a way.

    In the past, you could slap together an adapter by chopping up some old cable and wiring it to a new power supply. And things would work, even if the voltage or power ratings didn’t match exactly, or even at all (although things would usually work much worse then).

    I’ve jury-rigged an adapter for my laptop, which uses a 65 W, 20 V power brick, to run off a 45 W, 16 V one when mine died and I needed to access the files. It worked, as long as I wasn’t doing anything too computationally intensive on the thing.

    If the laptops had used USB-C, that very likely would not have worked at all. Chances are, the manufacturer of the smaller laptop would’ve bundled the cheapest power brick that covers the needs of the machine, so it would most likely have been 45 W at 15 V over Power Delivery, while mine would’ve been 65 W at 20 V. And since everything in the USB-C world has to talk and agree beforehand, chances are, nothing would even try to work, even though it realistically could.
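    A rough sketch of the difference, with the protocol heavily simplified and the power profiles invented to match the story above: a USB-PD device requests a profile from the list the charger advertises, and with no exact match there’s simply no agreement, while a dumb barrel jack just delivers whatever it has.

    def negotiate(charger_profiles, requested):
        # Simplified USB-PD style: exact match from the advertised list, or nothing.
        return requested if requested in charger_profiles else None

    def dumb_barrel_jack(supply, _requested):
        # The old way: you get whatever the brick puts out, agreed or not.
        return supply

    small_brick = [(5, 15), (15, 45)]   # (volts, watts) profiles it advertises
    laptop_wants = (20, 65)

    print(negotiate(small_brick, laptop_wants))      # None: no agreement, no power
    print(dumb_barrel_jack((16, 45), laptop_wants))  # (16, 45): runs, just slowly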