Related: I compiled karpathy's llama2.c (https://github.com/karpathy/llama2.c) to WASM without modifications and ran it in the browser. It was a fun exercise to directly compare native vs. web performance. I'm getting about 80% of native performance on my M1 MacBook Air, and I haven't spent any time optimizing the WASM side.
Demo: https://diegomarcos.com/llama2.c-web/
Code:
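For anyone curious what such a build roughly looks like, here is a minimal sketch of compiling llama2.c's run.c with Emscripten. This is illustrative only, not the exact build used for the demo; the flags and the model/tokenizer file names are assumptions.

# hypothetical Emscripten build of llama2.c (flags and files are illustrative)
emcc run.c -O3 -o llama2.html \
  -sALLOW_MEMORY_GROWTH=1 \
  --preload-file stories15M.bin \
  --preload-file tokenizer.bin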
Cool. Nice example of Atwood's Law [0] (though not really JS, of course).
If somebody hasn't tried running LLMs yet, here are some lines that do the job in Google Colab or locally.
# clone llama.cpp
! git clone https://github.com/ggerganov/llama.cpp.git
# download a quantized GGUF model into llama.cpp/models
! wget "https://huggingface.co/TheBloke/CodeLlama-7B-GGUF/resolve/ma..." -P llama.cpp/models
# build the main binary
! cd llama.cpp && make
# run in interactive instruct mode (-ins) with a 2048-token context on 8 threads
! ./llama.cpp/main -m ./llama.cpp/models/codellama-7b.Q8_0.gguf --color --ctx_size 2048 -n -1 -ins -b 256 --top_k 10000 --temp 0.2 --repeat_penalty 1.1 -t 8
[0] : https://en.wikipedia.org/wiki/Atwood's_Law
Yes, sorry for being unclear. I hope people who use shells will notice.
Shameless plug: https://github.com/jankovicsandras/ml <- here are some minimal Colab / Jupyter notebooks for absolute beginners.
I just find it amazing how little effort it takes to run an LLM nowadays.