wttr gang
I believe your “They use attention mechanisms to figure out which parts of the text are important” is just a restatement of my “break it into contextual chunks”, no?
Large language models literally do subspace projections on text to break it into contextual chunks, and then memorize the chunks. That’s how they’re defined.
Source: the paper that defined the transformer architecture and formulas for large language models, which alone has been cited over 85,000 times in academic sources: https://arxiv.org/abs/1706.03762
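To make the "subspace projection" point concrete, here's a minimal sketch of single-head scaled dot-product attention in NumPy, following the formulas from that paper. The shapes, random weights, and function names are illustrative assumptions, not a trained model or anyone's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(X, W_q, W_k, W_v):
    # Project token embeddings into query/key/value subspaces
    # (the learned linear projections from the paper).
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = Q.shape[-1]
    # Attention weights: how strongly each token attends to every other.
    weights = softmax(Q @ K.T / np.sqrt(d_k))
    # Each output row is a context-weighted mix of value vectors,
    # i.e. a context-dependent "chunk" of the sequence.
    return weights @ V

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))           # toy token embeddings
W_q, W_k, W_v = (rng.normal(size=(d_model, d_k)) for _ in range(3))
print(attention(X, W_q, W_k, W_v).shape)          # (4, 8)
```

The `W_q`, `W_k`, `W_v` matrices are exactly the projections being referred to: they map each token's embedding into lower-dimensional query/key/value subspaces before the attention weighting happens.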
I don’t have a very consistent naming theme. I’ve used various names related to music, science, and art. I have a decommissioned machine named “numbers”, for example.
However, I would like to point out that we have plenty more than eight celestial bodies of interest in the solar system if you include Eris, Ceres, Pluto, Makemake, the moons of Jupiter, and more. The scheme might not be indefinitely extendable, but it may help in the short term.
Definitely RE for me. I couldn’t sleep after the first time I saw a crimson head. The sharks were terrifying too.