As we know, we can attach an M.2 NVMe SSD to the Raspberry Pi 5 through a PCIe expansion board, connected to the Pi's PCIe port with an FFC cable; this gives a big improvement in data-transfer performance. At the same time, local large language models are more and more popular right now, and I want to build my own local LLM server. So I installed Ollama on my Raspberry Pi 5 and pulled down the llama3 and phi3 models to build my own AI agent. It works fine, if a little slow, and I have purchased a Hailo-8L AI co-processor for further development. By the way, I almost forgot the PoE HAT: it provides 5V @ 4.5A to my Raspberry Pi over an Ethernet cable connected to my Meert AYDLINLATMA PoE power supply (output: 48V/1A, 48W, 802.3at protocol).
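As a quick sanity check on the PoE setup, the numbers above work out: the HAT's 5V @ 4.5A output is 22.5W, which fits inside the roughly 25.5W that 802.3at can deliver to a powered device, which in turn is well within the supply's 48W rating. A small sketch of that arithmetic (the 25.5W figure is the standard 802.3at budget, not something from my specific hardware):

```python
# Sanity-check the PoE power chain: supply -> 802.3at budget -> Pi draw.
PSU_OUTPUT_W = 48 * 1.0    # Meert AYDLINLATMA supply: 48V @ 1A
AT_BUDGET_W = 25.5         # max power an 802.3at port delivers to the device
PI_DRAW_W = 5.0 * 4.5      # PoE HAT output to the Pi: 5V @ 4.5A

# Each stage must cover the one after it.
assert PI_DRAW_W <= AT_BUDGET_W <= PSU_OUTPUT_W
print(f"Pi draw: {PI_DRAW_W} W, 802.3at budget: {AT_BUDGET_W} W")
```

So there is a few watts of headroom even with the Pi under load, though attached NVMe and accelerator boards eat into it.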
It all works fine, and next I am going to deploy a web UI such as Lobe Chat (or another UI) on top of it on my local network.
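Any web UI deployed this way ultimately talks to Ollama's local HTTP API, which listens on port 11434 by default. A minimal sketch of such a query, assuming Ollama is running on the same machine and the llama3 model is already pulled (the prompt text is just an example):

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot (non-streaming) generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body that /api/generate expects."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("llama3", "Say hello in five words."))
```

To serve a UI to other machines on the LAN, Ollama also needs to listen on more than localhost (e.g. by setting `OLLAMA_HOST=0.0.0.0` in its environment).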