

So, I learned that the newest #SBC #ComputeModule models tend to be "#MachineLearning optimized". If I am going to try to build a personal equivalent of a #StarTrek #Tricorder or #Fallout #PipBoy, (a) will this feature be of any sensible use, (b) should I just ignore it, or (c) would it be better to get an older version without it?
in reply to 8Petros [$ rm -rv /capitalism/*]

I'd say b).
a) depends on the amount of RAM accessible to the GPU/NPU. A large LLM like Llama 3.1 70B (ChatGPT-like) needs a GPU with on the order of 24 GB of RAM or more; smaller variants (e.g. the 8B model, quantized) can fit in a few GB, but that's still a stretch for most SBCs.

But if it had a camera, the "MachineLearning optimized" SBC could be useful for on-device vision models. Still, mostly b).
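As a rough back-of-the-envelope (an illustrative sketch, not from the thread): the memory needed just to hold an LLM's weights is roughly parameter count × bytes per parameter, and the KV cache and activations add more on top. The sizes below are assumptions based on the published Llama 3.1 variants:

```python
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate GB needed just to store the model weights."""
    return n_params * bytes_per_param / 1e9

# fp16 = 2 bytes/param; 4-bit quantization = 0.5 bytes/param
for name, params in [("8B", 8e9), ("70B", 70e9)]:
    for precision, bpp in [("fp16", 2.0), ("int4", 0.5)]:
        print(f"Llama 3.1 {name} @ {precision}: ~{weight_memory_gb(params, bpp):.0f} GB")
```

This is why the 70B model wants a 24 GB+ GPU even when quantized, while a quantized 8B model can squeeze into a few GB.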