Supercomputers have been doing this sort of stuff for decades without much problem
Idk if I’d point at a supercomputer system and suggest it was constructed “without much problem”. Cray has significantly lagged the computer market as a whole.
The main issue is training; for LLMs, inference is pretty computationally cheap
Again, I would not consider anything in the LLM marketplace particularly cheap. Seems like they’re losing money rapidly.
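A rough back-of-envelope sketch of the training-vs-inference point above, using the common ~6·N·D approximation for training FLOPs and ~2·N FLOPs per generated token for inference. The parameter count and token counts below are illustrative assumptions, not figures from either comment.

```python
# Back-of-envelope FLOP comparison for an illustrative dense transformer.
# Assumptions (not from the thread): N = 70e9 parameters, trained on
# D = 2e12 tokens, using the common approximations of ~6*N*D FLOPs for
# training and ~2*N FLOPs per generated token for inference.

N = 70e9   # model parameters (assumed)
D = 2e12   # training tokens (assumed)

training_flops = 6 * N * D            # ~8.4e23 FLOPs, a one-time cost
inference_flops_per_token = 2 * N     # ~1.4e11 FLOPs per generated token

# How many generated tokens before cumulative inference compute
# matches the cost of the training run?
breakeven_tokens = training_flops / inference_flops_per_token

print(f"training:   {training_flops:.2e} FLOPs")
print(f"inference:  {inference_flops_per_token:.2e} FLOPs/token")
print(f"break-even: {breakeven_tokens:.2e} generated tokens (= 3*D)")
```

Under these assumptions the per-token inference cost is tiny relative to the training run, which is the sense in which inference is "computationally cheap"; whether serving at scale is cheap in dollar terms is a separate question, which is the point the reply is making.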