7B means 7 million right, so 6 is fine? 👀
My nightmare activity for the week can be boiled down to 'training a doomed LLM to prove a point', so that will be fun.
these all sound like rounding errors to me, but as a large language model, idk what a rounding error is anyway so i think the math is accurate
I have consulted my model and it is very certain that a rounding error is either "a decile value" or "a value between 0.1 and 0.1," so like… what is truth anyway