Nvidia’s leaked roadmap suggests it’ll break its usual two-year release cadence and launch the RTX 5000 series in 2025, not 2024 as anticipated.

  • Lengsel@latte.isnot.coffee · 1 year ago

    For that reason I’m going to buy a 4060 and upgrade in 3 years: it gets me AV1 encoding plus the ability to play some older games. Buy the cheapest option now, then see how the 50 series performs for the price.

    Watch for a potential Super series release in 6 months with more memory on each card.

    There’s been no talk of whether the 5000 series will launch at lower prices, given that the 4000 series can’t sell.

    • xNIBx@kbin.social · 1 year ago

      The problem with the 40xx cards isn’t just the memory amount but also their memory bus, which is extremely narrow. Nvidia is using a larger cache to compensate for the limited memory bandwidth, and that can work in some scenarios, but not always.

      And while you can maybe increase the RAM on some of these cards, you can’t widen their memory bus without completely redesigning the card.
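
      For a rough sense of what the narrower bus costs, here’s a back-of-envelope sketch (my own numbers, not from this thread; the bus widths and GDDR6 data rates are the published spec-sheet figures, so treat the results as approximate):

      ```python
      # Peak theoretical memory bandwidth:
      #   bus width (bits) / 8 -> bytes per transfer, times the effective data rate (Gbps)
      def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
          return bus_width_bits / 8 * data_rate_gbps

      # Illustrative spec-sheet figures (approximate):
      print(bandwidth_gb_s(128, 17))  # RTX 4060, 128-bit @ 17 Gbps GDDR6 -> 272.0 GB/s
      print(bandwidth_gb_s(192, 15))  # RTX 3060, 192-bit @ 15 Gbps GDDR6 -> 360.0 GB/s
      ```

      The bigger L2 cache can hide some of that gap when the working set fits, which is presumably why the shortfall only shows up in some workloads.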

      • Lengsel@latte.isnot.coffee · 1 year ago

        Yes, the 128-bit bus does hurt it, if that’s what you’re referring to. Also, both companies are hindering the bottom end with a PCIe x8 limitation instead of x16.
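
        To put rough numbers on the x8 point, here’s a quick sketch (my own math, not from the thread; per-lane rates are the nominal PCIe figures, ignoring protocol overhead details):

        ```python
        # Nominal usable bandwidth per PCIe lane, in GB/s (approximate)
        PER_LANE_GB_S = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

        def link_bandwidth_gb_s(gen: str, lanes: int) -> float:
            return PER_LANE_GB_S[gen] * lanes

        print(link_bandwidth_gb_s("PCIe 4.0", 8))   # x8 on a Gen4 board  -> ~15.8 GB/s
        print(link_bandwidth_gb_s("PCIe 4.0", 16))  # x16 on a Gen4 board -> ~31.5 GB/s
        print(link_bandwidth_gb_s("PCIe 3.0", 8))   # x8 drops to ~7.9 GB/s on an older Gen3 board
        ```

        So the x8 link mostly stings on older PCIe 3.0 boards, which is where budget cards like these often end up.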

    • gk99@kbin.social · 1 year ago

      Nvidia seems committed to their awful pricing and they’re raking in the money via enterprise AI nonsense, so I’d expect no price change.

      Not to mention that you’re still just buying their latest card, so the strategy is working after all.

      • tal@kbin.social · 1 year ago (edited)

        “Nvidia seems committed to their awful pricing and they’re raking in the money via enterprise AI nonsense”

        I think that they’re engaging in market segmentation. The AI guys need a lot of VRAM, and so Nvidia refrains from bumping VRAM on gaming cards, and charges an arm and a leg for the high-VRAM cards to extract more money from the people and companies doing AI work. That lets them capture both the gaming market at a lower price and the AI market at a higher price.

        Because of that segmentation, I don’t actually think that selling at a high price to the AI market requires charging a lot for gaming cards.

        Ordinarily, absent a cartel colluding to keep prices high, competition would prevent that: if a competitor could sell cards more cheaply, they would do so and take the market share. But AMD is presently playing catch-up in the AI field, so for now Nvidia has a pretty free hand, as they’re the only game in town for many.

      • Lengsel@latte.isnot.coffee · 1 year ago

        I’m still on a GTX card, no RTX. I want AV1 encoding, Intel Arc sucks for DX11 games, and the 7600 is usually behind the 4060; the 7600 only matches it in a handful of titles.

        • trynn@kbin.social · 1 year ago

          The 4060 isn’t out yet and hasn’t had independent benchmarks published, so I’m not sure how you can say how it compares to the 7600. Unless you meant the 4060 Ti? But that card costs $200 more than a 7600, so it’s not really comparable.

          • Lengsel@latte.isnot.coffee · 1 year ago

            There are a lot of 4060 review videos on YouTube released today, ahead of the retail launch tomorrow. The third-party review embargo lifted today.

            • trynn@kbin.social · 1 year ago

              Ah yeah, I see them now. I don’t think they had been published when I responded earlier.

              But yeesh, those benchmarks are pretty terrible for the price, at least if you’re looking for an apples-to-apples comparison without frame generation. Might as well save the 50 bucks and go with the 7600, since the average performance difference is only single-digit percentages.

              • Lengsel@latte.isnot.coffee · 1 year ago

                I also have a mild interest in ray tracing performance for 2 older games, nothing as demanding as Cyberpunk. The 7600 is not at the same level as the 4060 for ray tracing.

                Also, for AV1 encoding, it seems programs like OBS have better support for Nvidia’s encoder.

                I could try Intel but I require native DX11 support, and Intel is not an option for that.
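
                If you want to sanity-check hardware AV1 encoding outside of OBS, here’s a quick sketch (my own example, assuming ffmpeg is on PATH and built with NVENC support; av1_nvenc is Nvidia’s hardware AV1 encoder):

                ```python
                import subprocess

                # List the encoders this ffmpeg build knows about and print the AV1 ones.
                # av1_nvenc showing up here means the build supports it; actually encoding
                # with it still requires a 40-series (Ada) GPU. Otherwise you'll only see
                # software encoders like libaom-av1 / libsvtav1.
                encoders = subprocess.run(
                    ["ffmpeg", "-hide_banner", "-encoders"],
                    capture_output=True, text=True, check=True,
                ).stdout

                for line in encoders.splitlines():
                    if "av1" in line.lower():
                        print(line.strip())
                ```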