Here comes the push.

  • Cyborganism@lemmy.ca

    LOL! To “make it safer”.

    No.

    It’s to increase share value by creating a rush to buy their graphics cards.

    • Eldritch@lemmy.world

      Help help! Crypto never took off! Won’t someone think of our bottom line!

  • devbo@lemmy.world

    Why do CEOs never say “let’s take our time to avoid making mistakes and ensure quality”?

    • nicetriangle@kbin.social

      Capitalism’s “big number must go up every quarter” bullshit. It forces myopic short-term decision making and will probably be the death of civilization.

    • magnetosphere@kbin.social

      Because that’s boring and doesn’t make good headlines. Not making exciting headlines means share prices stagnate. Taking your time also means that someone else might beat you to the punch. Neither of those things is good for the company’s bottom line or the CEO’s business reputation.

    • Centillionaire@kbin.social

      Because they are not Miyamoto, helping churn out Zelda and Mario games that feel familiar, yet innovative. 🤌🏻

  • aelwero@lemmy.world

    That’s like saying we should all drive faster to help identify shortcomings in traffic signals or vehicle safety features…

    Sounds like a flimsy-ass false pretense to chase profits. Just my opinion, mind, but it doesn’t sound like something anyone would posit at face value.

    • stifle867@programming.dev

      What’s Silicon Valley’s favourite saying? Move fast and…make things safer? Close enough.

    • Katlah@lemmy.dbzer0.com

      we should all drive faster to help identify shortcomings in traffic signals or vehicle safety features

      American traffic engineers in question

  • the_q@lemmy.world

    Oh is that why? It’s not because Nvidia is making bank in the AI sector and he’s just another greedy CEO?

  • Dr. Dabbles@lemmy.world

    “Please buy more of my hardware so nobody finds out how deep in trouble my company is. PLEASE.”

      • Dr. Dabbles@lemmy.world

        Their “investments” in their largest customers are showing up as sales volume when they’re essentially giving products away. This coming year has four major companies coming to market with deadly serious competition, and the magic money investment in AI scams is drying up very quickly. So, if I were Nvidia, doing what Jensen has been up to with the CEO-to-CEO customer meetings to arrange delivery timelines, and royally screwing over his most reliable channel partners, I’d hope with everything I have that the customers I have keep buying needlessly so the bubble never bursts.

    • Dr. Dabbles@lemmy.world

      Dedicated ASICs are where all the hotness lies. The flexibility of FPGAs doesn’t seem to overcome their overhead for most users. Not sure if that will change when custom ASICs become too expensive again and all the magic money furnaces run out of bills to burn.

      • just_another_person@lemmy.world

        ASICs are single-purpose, with the benefit of potential power-efficiency improvements. Not at all useful for something like running neural networks, especially not when they are being retrained and updated.

        FPGAs are fully (re)programmable. There’s a reason why datacenters don’t lease ASIC instances.

        • Dr. Dabbles@lemmy.world

          Not at all useful for something like running neural networks

          Um. lol What? You may want to do your research here, because you’re so far off base I don’t think you’re even playing the right game.

          There’s a reason why datacenters don’t lease ASIC instances.

          Ok, so you should just go ahead and tell all the ASIC companies then.

          https://www.allaboutcircuits.com/news/intel-and-google-collaborate-on-computing-asic-data-centers/

          https://www.datacenterfrontier.com/servers/article/33005340/closer-look-metas-custom-asic-for-ai-computing

          https://ieeexplore.ieee.org/document/7551392

          Seriously. You realize that the most successful TPUs in the industry are ASICs, right? And that all the “AI” components in your phone are too? What are you even talking about here?

          • just_another_person@lemmy.world

            TPUs are specific to individual model frameworks, and engineers avoid using them for that reason. The most successful adoptions so far are vendor-locked-in NN models a la Amazon (Trainium) and Google (Coral), and neither has wide adoption since they have limited scopes. The GPU game being flexible in this arena is exactly why companies like OpenAI are struggling to justify the cost of using them over TPUs: easy to run up front, but the cost is insane, and TPUs are even more expensive in most cases. A TPU is also inflexible should you need to do something like multi-model inference (detection+evaluation+result…etc; a rough sketch of that kind of pipeline follows this comment).

            As I said, ASICs are single purpose, so you’re stuck running a limited model engine (Tensorflow) and instruction set. They also take a lot of engineering effort to design, so unless you’re going all-in on a specific engine and thinking you’re going to be good for years, it’s short sighted to do so. If you read up, you’ll see the most commonly deployed edge boards in the world are…Jetsons.

            Enter FPGAs.

            FPGAs have speedup improvements for certain things like transcoding and inference in the 2x-5x range for specific workloads, and much higher for ML purposes and in-memory datasets (think Apache Ignite+Arrow workloads), and at a massive reduction in power and cooling, so obviously very attractive for datacenters to put into production. The newer slew of chips out are even reprogrammable “on the fly”, meaning a simple context switch and flash can take milliseconds, and multi-purpose workloads can exist in a single application, where this was problematic before.

            So unless you’ve got some articles about the most prescient AI companies currently using GPUs and moving to ASIC, the field is wide open for FPGA, and the datacenter adoption of such says it’s the path forward unless Nvidia starts kicking out more efficient devices.
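
            A minimal sketch of the multi-model pipeline mentioned above, in PyTorch with torchvision (an assumption; the comment names no code). The two models, the 0.5 score cutoff, and the 224×224 resize are illustrative stand-ins, not anyone’s production setup:

            # Hypothetical sketch: a detection model feeding a separate
            # classifier on the same general-purpose device. Assumes
            # PyTorch + torchvision; both model choices are illustrative.
            import torch
            import torchvision
            import torch.nn.functional as F

            device = "cuda" if torch.cuda.is_available() else "cpu"

            # Stage 1: detector proposes boxes.
            detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
            detector.eval().to(device)

            # Stage 2: a separate classifier evaluates each detected crop.
            classifier = torchvision.models.resnet18(weights="DEFAULT")
            classifier.eval().to(device)

            @torch.no_grad()
            def pipeline(image):  # image: float CHW tensor in [0, 1]
                detections = detector([image.to(device)])[0]
                results = []
                for box, score in zip(detections["boxes"], detections["scores"]):
                    if score < 0.5:  # arbitrary cutoff for the sketch
                        continue
                    x1, y1, x2, y2 = box.int().tolist()
                    crop = image[:, y1:y2, x1:x2].unsqueeze(0).to(device)
                    crop = F.interpolate(crop, size=(224, 224))
                    results.append((box.tolist(), classifier(crop).argmax(1).item()))
                return results

            # Swapping either stage for a retrained model is a code change,
            # not a new piece of silicon -- the flexibility argued for above.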

            • Dr. Dabbles@lemmy.world

              Now ask OpenAI to type for you what the drawbacks of FPGAs are. Also, the newest slew of chips is using partially charged NAND gates instead of FPGAs.

              Almost all ASICs being used right now implement the basic math functions, activations, etc., and the higher-level work happens in more generalized silicon. You cannot get the transistor densities necessary for modern accelerator work in an FPGA.
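
              A toy illustration of that split, in Python with NumPy (an assumption; the comment names no code): each hard-wired block does one fixed integer operation, while ordinary software plays the role of the “more generalized silicon” that sequences them.

              # Toy model (NumPy assumed) of fixed-function math blocks plus
              # a flexible orchestration layer. Shapes/values are illustrative.
              import numpy as np

              def fixed_matmul_int8(a, b):
                  # Stand-in for a hard-wired int8 matrix-multiply block:
                  # one operation, one datatype, zero flexibility.
                  assert a.dtype == np.int8 and b.dtype == np.int8
                  return a.astype(np.int32) @ b.astype(np.int32)  # widen to avoid overflow

              def fixed_relu(x):
                  # Stand-in for a hard-wired activation block.
                  return np.maximum(x, 0)

              def tiny_forward(x, w1, w2):
                  # The "generalized" part: which blocks run, in what order,
                  # on what data is decided here, not baked into silicon.
                  h = fixed_relu(fixed_matmul_int8(x, w1))
                  h = np.clip(h, 0, 127).astype(np.int8)  # crude requantization to int8
                  return fixed_matmul_int8(h, w2)

              rng = np.random.default_rng(0)
              x = rng.integers(-8, 8, size=(1, 16), dtype=np.int8)
              w1 = rng.integers(-8, 8, size=(16, 32), dtype=np.int8)
              w2 = rng.integers(-8, 8, size=(32, 10), dtype=np.int8)
              print(tiny_forward(x, w1, w2).shape)  # (1, 10)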

              • just_another_person@lemmy.world

                Friend, I do this for a living, and I have no idea why you’re even bringing gating into the equation, because it doesn’t even matter.

                I’m assuming you’re a big crypto fan, because that’s about all I could say ASICs in an HPC type of environment are good for. Companies who pay the insane amounts of money for “AI” right now want a CHEAP solution, and ASIC is the most short-term, e-wastey, inflexible solution to that problem. When you get a job in the industry and understand the different vectors, let’s talk. Otherwise, you’re just spouting junk.

                • Dr. Dabbles@lemmy.world

                  I’m assuming you’re a big crypto fan

                  Swing and a miss.

                  because that’s about all I could say ASICs in an HPC type of environment are good for

                  Really? Gee, I think switching fabrics might have a thing to tell you. For someone that does this for a living, to not know the extremely common places that ASICs are used is a bit of a shock.

                  want a CHEAP solution

                  Yeah, I already covered that in my initial comment, thanks for repeating my idea back to me.

                  and ASIC is the most short-term

                  Literally being attached to the Intel tiles in Sapphire Rapids and beyond. Used in every switch, network card, and millions of other devices. Every accelerator you can list is an ASIC. Shit, I’ve got a Xilinx Alveo 30 in my basement at home. But yeah, because you can get an FPGA instance in AWS, you think you know that ASICs aren’t used. lmao

                  e-wastey

                  I’ve got bad news for you about ML as a whole.

                  inflexible

                  Sometimes the flexibility of a device’s application isn’t the device itself, but how it’s used. Again, if I can do thousands, tens of thousands, or hundreds of thousands of integer operations in a tenth of the power and a tenth of the clock cycles, then load those results into a segment of activation functions that can do the same, and all I have to do is move this data with HBM and perhaps add some cheap ARM cores, bridge all of this into a single SoC product, and sell them on the open market, well then I’ve created every single modern ARM product that has ML acceleration. And also Nvidia’s latest products (see the back-of-envelope math after this comment).

                  Woops.

                  When you get a job in the industry

                  I’ve been a hardware engineer for longer than you’ve been alive, most likely. I built my first FPGA product in the 90s. I strongly suspect you just found this hammer and don’t actually know what the market as a whole entails, let alone the long LONG history of all of these things.

                  Do look up ASICs in switching, BTW. You might learn something.
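
                  Taking the “tenth of the power, and a tenth of the clock cycles” claim above at face value, the compounding is easy to work out. A back-of-envelope sketch in Python; the baseline figures are invented purely for illustration, and it assumes both parts run at the same clock frequency:

                  # Back-of-envelope math for the claim above. Baseline numbers
                  # are made up for illustration; equal clock frequency is
                  # assumed, so cycles ~ time and cycles x watts ~ energy.
                  baseline_cycles = 1_000_000   # cycles for an integer workload on a general core
                  baseline_power_w = 10.0       # watts drawn while doing it

                  asic_cycles = baseline_cycles / 10    # "a tenth of the clock cycles"
                  asic_power_w = baseline_power_w / 10  # "a tenth of the power"

                  baseline_energy = baseline_cycles * baseline_power_w
                  asic_energy = asic_cycles * asic_power_w
                  print(baseline_energy / asic_energy)  # 100.0 -> the two factors compound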

  • li10@feddit.uk

    I guess we’re just waiting until something awful happens before it’s properly regulated.

    Not surprising.

  • spudwart@spudwart.com

    Nvidia has pissed off the PC gaming industry, which is why their focus has shifted to the server markets.

    Now in the server markets they’re not top dog, so they want to straddle the fence.

    This divided focus will be their undoing.

    • misk@sopuli.xyzOP

      Nvidia is top dog in the machine learning accelerator / server market, which is why they have neglected the GPU market. They just want more.

      • spudwart@spudwart.com

        If their winning position was truly “top dog” they wouldn’t be shivering, shaking and pissing themselves begging for people to purchase their crap using insane lines like “The more you buy the more you save.”

        This is total and complete panic.

          • spudwart@spudwart.com

            This is precisely why they’re panicked. They’ve just had record stock numbers, record income, record everything. This system depends on constant growth. They need to keep that growth going or the shareholders will pull out, and it will cause their stock to crash.

            This all built up just like the crypto rush of 2020-2021, which you can see in the chart. It went up, peaked, and then dropped when crypto crashed. They’re betting big on AI; when it turns out AI isn’t as profitable as they thought it would be, it will crash too.

            • misk@sopuli.xyzOP

              You seem to change narrative / move goalposts with every response.

              Nvidia is okay; nobody expects another 1000% stock price increase. Crypto mania is a faint memory when looking at the profits from ML. They can’t keep up with demand, to the point of partially dropping some market segments. Their customers will keep gobbling up hardware for the foreseeable future because there are legitimate uses for it, and data scientists / engineers will keep processing larger and larger datasets.

              If anything, they’re likely scared of competition from big tech players doing custom chips. Amazon, Google, and Microsoft all sell cloud computing resources and already do their own silicon design. They’re definitely looking to have a piece of that “AI” pie too.

              Nvidia puts out sleazy statements like the one I’ve linked to nudge regulators to make that pie even bigger.

              • spudwart@spudwart.com

                A top-dog position means slow and steady growth or consistent government backing. Sharp growth like they’ve been having consistently leads to crashes. They don’t even need to post a loss; they just need to under-perform their next quarterly projections.

                My pointing out crypto mania was not to say that crypto is why they failed. It’s that crypto gave them a market boost, followed by a crash when crypto began to crash. That crash wasn’t explicitly because crypto crashed; it was because their projections weren’t met as a result of crypto’s crash. Crypto doesn’t need to be involved for their projections to falter. They can be ahead and winning the game today, but that doesn’t make them top dog. Top dog is stable; top dog doesn’t sit at the precipice of another crash.

                Their entire system is built on pump-and-dump-style investments as a result of the crypto era. They chase quick and easy money, but every time it results in a steep crash afterward. This will be their undoing, because it will hurt their reputation, as it already has. The quest for more money comes at the expense of everything else.

                Top dog doesn’t need to worry about reputation in the long run, because they’ll maintain their place in the long run. An aware front runner will make note of their position and act accordingly. Nvidia is a front runner, planning like a top dog, acting like a top dog.

                They are panicked, because they’re not Microsoft, Google, Amazon, IBM or Oracle. They don’t have a decades long presence in the economy that if they go under, everyone goes down with them.

                These top dogs don’t even need to worry about their stocks because they will get bailed out. Nvidia is banking on the idea that they’re just as needed. But if they really were just as needed, they wouldn’t have to peacock or nudge. They’d just have to threaten that they’re going to go under, and governments would panic in fear of an economic collapse.

                This is a gamble, and one they will lose. Their decision to divide their position and go after lottery tickets, both in the stock market and in regulations will be their undoing.

                If they were needed, they wouldn’t have any need to look needed.