- coral-pi-rest-server: Performs inference on TensorFlow Lite models on a Raspberry Pi, with acceleration from a Coral USB stick
However, a Pi doesn't have the horsepower to run something like Llama.cpp on its own, of course, so I've been considering something like the Coral USB Accelerator (https://coral.ai/products/accelerator). As I've learned more about it, though, it seems very much geared towards TensorFlow Lite models, whereas whisper.cpp and Llama.cpp use GGML models.
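The mismatch above is a container-format one: the Coral Edge TPU toolchain consumes TensorFlow Lite FlatBuffer files, while llama.cpp and whisper.cpp load GGML-family files, and the two are not interchangeable. A minimal sketch of how you might tell them apart by file magic is below; the function name is hypothetical, and the magic bytes (the "TFL3" FlatBuffer identifier at offset 4 for .tflite files, "GGUF" at offset 0 for current llama.cpp models, and "lmgg" for the legacy little-endian GGML magic) are my understanding of the formats, not something from the original post.

```python
def detect_model_format(header: bytes) -> str:
    """Guess a model container format from the first 8 bytes of a file.

    Hypothetical helper: checks well-known magic bytes rather than
    parsing the containers.
    """
    # TensorFlow Lite models are FlatBuffers with file identifier "TFL3"
    # stored at byte offset 4.
    if len(header) >= 8 and header[4:8] == b"TFL3":
        return "tflite"
    # Current llama.cpp/whisper.cpp models start with the ASCII magic "GGUF".
    if header[:4] == b"GGUF":
        return "gguf"
    # Legacy GGML files used the magic 0x67676d6c, which reads "lmgg"
    # when written as a little-endian uint32.
    if header[:4] == b"lmgg":
        return "ggml"
    return "unknown"


if __name__ == "__main__":
    # Synthetic headers for illustration only; a real check would read
    # the first 8 bytes of the model file from disk.
    print(detect_model_format(b"\x00\x00\x00\x00TFL3"))  # tflite
    print(detect_model_format(b"GGUF\x03\x00\x00\x00"))  # gguf
```

The practical upshot for the post above: a GGML/GGUF model can't be handed to the Edge TPU as-is; it would need to be a TFLite model compiled for the Edge TPU.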
NOTE:
The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives.
Hence, a higher number means a more popular project.