What am I doing wrong? I have a Lenovo P50 with 64GB RAM and 2 fast SSDs, but a lame GPU and not the best processor, I guess. So why does my local LLM not perform?
A user on Mastodon is asking why their local Large Language Model (LLM) setup performs poorly. Despite having a Lenovo P50 laptop with 64GB of RAM and fast SSDs, the user sees sluggish inference, and contrasts this with small Raspberry Pi machines that appear to handle AI tasks well. The user suspects their GPU or processor is the bottleneck, though they later acknowledge the Raspberry Pi's advantage may come from a dedicated AI accelerator attached to its header.
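On a CPU-only machine like this, token generation speed is usually limited by memory bandwidth rather than raw compute, since each generated token streams roughly the full set of model weights through RAM. A minimal back-of-envelope sketch, using assumed illustrative numbers (a DDR4 laptop at ~35 GB/s and a 7B model quantized to 4-bit, ~4 GB of weights), shows why even plenty of RAM does not help:

```python
# Back-of-envelope estimate of LLM token generation speed on a
# CPU-only machine, where memory bandwidth is the usual bottleneck.
# All numbers below are illustrative assumptions, not measurements.

def tokens_per_second(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Each generated token requires streaming roughly all model
    weights through memory once, so the token rate is approximately
    memory bandwidth divided by model size."""
    return bandwidth_gb_s / model_size_gb

# Assumed: ~35 GB/s DDR4 bandwidth, 7B model at 4-bit (~4 GB).
print(tokens_per_second(35.0, 4.0))  # 8.75 tokens/s, a rough ceiling
```

The estimate suggests that on such hardware a quantized 7B model tops out at under ~10 tokens/s no matter how much RAM is installed, which is why a small board with a dedicated accelerator (and its own fast memory path) can feel faster.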