FWIW, LLMs are deterministic. The commercial front-ends usually don't let you set the seed, but behind the scenes the only reason the output changes each time is that the seed changes. With a fixed seed, input X always leads to output Y.
From the user perspective: no? I think that's what they call "temperature", and even setting it to 0 didn't reproduce the same result the next day after the cache was cleared.
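To make the seed/temperature distinction concrete, here is a minimal sketch of token sampling in PyTorch (my own toy illustration, not how any particular provider implements it): temperature 0 reduces to argmax with no randomness at all, while temperature > 0 samples from the softmax distribution, which is still repeatable once the RNG seed is pinned.

    import torch

    def sample_next_token(logits, temperature=1.0, seed=None):
        # Temperature 0: greedy decoding, no RNG involved at all.
        if temperature == 0:
            return int(torch.argmax(logits))
        # Otherwise sample from the tempered softmax distribution;
        # determinism now hinges entirely on the RNG seed.
        if seed is not None:
            torch.manual_seed(seed)
        probs = torch.softmax(logits / temperature, dim=-1)
        return int(torch.multinomial(probs, num_samples=1))

    # Hypothetical logits for a 5-token vocabulary.
    logits = torch.tensor([2.0, 1.0, 0.5, 0.2, -1.0])

    # Same seed, same input -> same token on every run.
    print(sample_next_token(logits, temperature=0.8, seed=42))
    print(sample_next_token(logits, temperature=0.8, seed=42))

    # No pinned seed -> the sampled token can differ between runs.
    print(sample_next_token(logits, temperature=0.8))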