• 1 Post
  • 28 Comments
Joined 1 year ago
Cake day: June 6th, 2023


  • Kashif Shah@lemmy.sdf.org to Mildly Infuriating@lemmy.world · For security reasons · 6 months ago

    Haha, that is a great idea! Give the landmine kill a special animation just to make sure the cheaters get the message, or let them figure it out in time, lol? (Rough sketch at the end of this comment.)

    Heh, did you share that inventory technique on news.ycombinator? I could have sworn that I read a story there about a team doing that.

    I know exactly what you are talking about - I was digging into the modding of one game and happened upon a cheaters’ forum. It blew my mind that the first step was to completely gut your computer’s security, lol. But at the same time, it was enlightening to see. It seems like some of the work has moved into the anti-cheat systems, but I’m guessing there must be large gaps in what the AC can actually do for you at the application level?
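    Rough sketch of that landmine idea, purely illustrative: no real game engine or anti-cheat API is assumed, and every name in it (Player, pick_death_animation, the animation IDs) is made up.

    ```python
    # Toy sketch: server-side honeypot that gives flagged cheaters a special
    # death animation. Every name here is invented for illustration; this is
    # not any real game engine or anti-cheat API.
    from dataclasses import dataclass

    ANIM_DEFAULT = "standard_death"
    ANIM_CHEATER = "landmine_shame"  # the special animation cheaters get

    @dataclass
    class Player:
        name: str
        flagged_cheater: bool = False  # set by server-side detection heuristics

    def pick_death_animation(victim: Player) -> str:
        """Decide the animation server-side, so the client can't tell why."""
        return ANIM_CHEATER if victim.flagged_cheater else ANIM_DEFAULT

    # A flagged player steps on the honeypot landmine:
    print(pick_death_animation(Player("wallhacker42", flagged_cheater=True)))
    # -> landmine_shame
    ```

    The point of deciding server-side is that the cheat client never sees a flag it could strip out; it just receives a different animation ID.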


  • However many years it takes for these LLM fools to wake up, hopefully they can find a way to laugh at themselves for thinking it was cutting-edge to jam the internet into a fake jellyfish brain and call it GPT. I haven’t looked recently, but I still haven’t seen anyone talking about neuroglial networks and how they could revolutionize the applications of AI.

    There’s a big*** book on the subject, but apparently no public takers in the deep neural network space?
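    To make “neuroglial” concrete, here is a toy sketch of my own - not from that book or any particular paper - of a neuron whose effective weight is modulated by a slow glial state, the way an astrocyte tracks synaptic traffic:

    ```python
    # Toy illustration only: a neuron whose synapse is scaled by a slow
    # glial state. My own sketch, not an implementation from any paper.
    import math

    def neuroglial_step(x: float, w: float, glia: float,
                        decay: float = 0.9, gain: float = 0.1):
        """One step: fast neuron output, slow glial feedback on the weight."""
        out = 1.0 / (1.0 + math.exp(-(w * (1.0 + glia)) * x))
        glia = decay * glia + gain * out  # glia integrates recent activity
        return out, glia

    g = 0.0
    for t in range(5):
        y, g = neuroglial_step(x=1.0, w=0.5, glia=g)
        print(f"step {t}: output={y:.3f} glial_state={g:.3f}")
    ```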


  • That is a very valid and reasonable opinion, sorry to see it downvoted.

    There will be strong disagreement with you, however, on the claim that LLMs are a big enough resource hog to justify banning them outright for that reason alone.

    If you are looking for Big Tech hit boxes, try for things like writing laws that require all energy consumption in datacenters to be monitored and reported using established cross-disciplinary methods (toy sketch of such a report at the end of this comment).

    Or getting people to stop buying phones every year. Or banning disposable vapes.
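    For the monitoring idea above, a toy sketch of what a mandated per-facility report could contain; every field name is invented, not from any existing regulation (PUE itself is a real industry metric):

    ```python
    # Invented illustration of a per-facility energy report a disclosure
    # law might mandate; all field names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class DatacenterEnergyReport:
        facility_id: str
        period: str                # e.g. "2024-Q1"
        total_kwh: float           # metered electricity consumption
        pue: float                 # power usage effectiveness
        renewable_fraction: float  # 0.0-1.0 share from renewables

    print(DatacenterEnergyReport(
        facility_id="us-east-dc-07",  # placeholder facility
        period="2024-Q1",
        total_kwh=1.2e7,
        pue=1.18,
        renewable_fraction=0.42,
    ))
    ```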


  • From earth.org: “Data centres are typically considered black box compared to other industries reporting their carbon footprint; thus, while researchers have estimated emissions, there is no explicit figure documenting the total power used by ChatGPT. The rapid growth of the AI sector combined with limited transparency means that the total electricity use and carbon emissions attributed to AI are unknown, and major cloud providers are not providing the necessary information.”



  • Kashif Shah@lemmy.sdf.org to Mildly Infuriating@lemmy.world · For security reasons · 6 months ago

    Kudos for looking out for your future self first - I had to leave the field entirely after it got to the point where I couldn’t stand to look at a computer anymore. Still can’t for more than an hour, two years later.

    I intend to reply more later, because this does deserve a longer reply, but I am short on steam.

    In the meantime, have you heard of login.gov? Check that out. The day that .com gets a hook into that is the day that identity problems are (mostly) solved.
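    For the curious, a minimal sketch of what a .com “getting a hook into” login.gov could look like via standard OpenID Connect. The endpoint and parameters are from my memory of developers.login.gov, and the client_id and redirect_uri values are placeholders, so check the official docs before trusting any of it:

    ```python
    # Sketch of a standard OIDC authorization-code request to login.gov.
    # Endpoint/parameter names recalled from developers.login.gov; the
    # client_id and redirect_uri below are placeholders. Verify everything.
    from urllib.parse import urlencode

    AUTHORIZE_URL = "https://secure.login.gov/openid_connect/authorize"

    def build_authorization_url(client_id: str, redirect_uri: str,
                                state: str, nonce: str) -> str:
        """Build the URL a relying party would redirect the user to."""
        params = {
            "client_id": client_id,      # issued when the app registers
            "response_type": "code",
            "scope": "openid email",
            "redirect_uri": redirect_uri,
            "state": state,              # CSRF protection
            "nonce": nonce,              # replay protection
        }
        return f"{AUTHORIZE_URL}?{urlencode(params)}"

    print(build_authorization_url(
        "urn:gov:gsa:openidconnect:sp:example",   # placeholder client_id
        "https://example.com/auth/callback",      # placeholder redirect
        "state123", "nonce456",
    ))
    ```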



  • Kashif Shah@lemmy.sdf.org to Mildly Infuriating@lemmy.world · For security reasons · 6 months ago

    I spent about a decade in the enterprise software development space, so I totally get it. I couldn’t put it into words as well as you did, however.

    After watching the FCC bigwigs debate robocalls several years ago, I’ve become a believer in a future where your internet access is always authenticated to your real-life ID, dark web excepted, of course.

    In their case, it was posited as a best-in-class solution to the problem of spam in the telephony space. Same logic applies to email. I mean, look at what Twixxer did with the verified checkmark requiring a credit card. The trend is already there.

    I get the fear of being de-anonymized on the internet, but it may be a case of something we hate being something we need, once you start to throw deepfakes into the mix.