Big fan of SBC gaming, open source engine recreations/source ports, gaming in general, alternative operating systems, and all things modding.

Trying to post and comment often in an effort to add to Lemmy’s growth.

  • 146 Posts
  • 1.66K Comments
Joined 9 months ago
Cake day: August 13th, 2023

  • I would hold onto it for a bit. If you want to run software that isn’t from Steam, like the game modding app Vortex for example, it can be easier to tinker with on Windows.

    Most of it can run on Linux just fine with Wine or Lutris, but you might run into the odd graphical glitch that can be a headache if you are inexperienced, especially if you rarely use that application.

  • Corroded@leminal.space (OP) to Wikipedia@lemmy.world · AI Winter · 9 days ago

    I feel like it’s more in line with a staircase, where progress is made, followed by plateaus.

    If, for example, we can talk to voice assistants in a more human manner and ask follow-up questions like “What was that last part you said?”, I don’t think there would be much reason to regress.



  • Corroded@leminal.space (OP) to Wikipedia@lemmy.world · AI Winter · edited · 10 days ago

    In the history of artificial intelligence, an AI winter is a period of reduced funding and interest in artificial intelligence research. The field has experienced several hype cycles, followed by disappointment and criticism, followed by funding cuts, followed by renewed interest years or even decades later.


    There were two major winters approximately 1974–1980 and 1987–2000, and several smaller episodes, including the following:

    • 1966: failure of machine translation
    • 1969: criticism of perceptrons (early, single-layer artificial neural networks)
    • 1971–75: DARPA’s frustration with the Speech Understanding Research program at Carnegie Mellon University
    • 1973: large decrease in AI research in the United Kingdom in response to the Lighthill report
    • 1973–74: DARPA’s cutbacks to academic AI research in general
    • 1987: collapse of the LISP machine market
    • 1988: cancellation of new spending on AI by the Strategic Computing Initiative
    • 1990s: many expert systems were abandoned
    • 1990s: end of the Fifth Generation computer project’s original goals



  • Am I the only one who finds this so weird when we talk about LLMs? If someone makes a bot that resembles some specific person, that person’s rights aren’t really violated, and since it’s all fictional content, it is very hard to break actual laws with it. At that point we would have to also ban people’s weird fan fiction, no?

    I’m not arguing about what they want or don’t want on their platform, but the legal and alleged moral questions and arguments always weird me out a bit, because no one is actually getting hurt in any way by weirdos having weird chats with computers.

    I could see some people making the argument that it could be considered defamatory, especially in cases where it is being passed off as real. Politicians might even try to tie it in with revenge porn or other non-consensual pornography laws.

    It would sure get messy in a hurry, though. Imagine someone trying to make lewd images of Tomb Raider’s Lara Croft, for example, and accidentally generating images resembling Alicia Vikander or Angelina Jolie from the Tomb Raider movies.