• 0 Posts
  • 22 Comments
Joined 1 year ago
Cake day: June 20th, 2023

  • Thank you! That helped give some more insight into why exactly it’s bad; I knew it was, but wasn’t sure through which processes. My doctorate was in neuroscience, and around the time I left academia, gut microbiome-brain axis research was really starting to ramp up. It was the big buzzword at the time. But since I left to work in industry, I haven’t really kept up with developments in the gut microbiome neuroscience field.
    I really wish they’d find a better way to treat chronic cystitis than antibiotics, but so far it’s the only treatment that reliably helps.

  • Yeah, if you already have it then it’s not really an extra cost. But the smaller models perform worse and less reliably.

    To write a book convincing enough to fool at least some buyers, I wouldn’t expect a Llama2 7B to do the trick, based on what I see in my work (ML engineer). But even at work, I run Llama2 70B quantized at most, not the full-size one. Full size unquantized needs around 320 GB of GPU VRAM (see the rough weights-only sketch after this comment), and that’s just quite expensive (even more so when you have to rent it from cloud providers).

    Although if you already have a GPU that size at home, then of course you can run any LLM you like :)
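    For context on where a number in that range comes from, here’s a rough weights-only estimate. The parameter count and bytes-per-parameter figures are illustrative assumptions; real usage adds KV cache, activations, and framework overhead on top, which is how an unquantized 70B model ends up in the 300+ GB range:

    ```python
    # Back-of-the-envelope VRAM needed just to hold the weights of a
    # 70B-parameter model at different precisions. KV cache, activations,
    # and runtime overhead are NOT included and push the real requirement
    # noticeably higher.

    def weight_vram_gib(n_params: float, bytes_per_param: float) -> float:
        """Approximate GiB required to store the weights alone."""
        return n_params * bytes_per_param / 1024**3

    N_PARAMS = 70e9  # Llama2 70B

    for precision, bytes_per_param in [("fp32", 4.0), ("fp16", 2.0), ("int4", 0.5)]:
        print(f"{precision}: ~{weight_vram_gib(N_PARAMS, bytes_per_param):.0f} GiB")
    # fp32 ~261 GiB, fp16 ~130 GiB, int4 ~33 GiB (weights only)
    ```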