| | mistral-src | slint |
|---|---|---|
| Mentions | 9 | 138 |
| Stars | 8,732 | 15,163 |
| Growth | 4.1% | 3.4% |
| Activity | 7.3 | 9.9 |
| Last commit | about 2 months ago | 4 days ago |
| Language | Jupyter Notebook | Rust |
| License | Apache License 2.0 | GNU General Public License v3.0 or Slint Royalty-Free |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
mistral-src
-
Mistral 7B vs. Mixtral 8x7B
Mistral AI, a French startup, has released two impressive large language models (LLMs): Mistral 7B and Mixtral 8x7B. These models push the boundaries of performance and introduce architectural innovations aimed at optimizing inference speed and computational efficiency.
-
How to have your own ChatGPT on your machine (and make it talk to itself)
However, some models are publicly available. It's the case for Mistral, a fast and efficient French model which seems to outperform GPT-4 on some tasks. And it is under the Apache 2.0 license.
-
How to Serve LLM Completions in Production
I recommend starting either with llama2 or Mistral. You need to download the pretrained weights and convert them into GGUF format before they can be used with llama.cpp.
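The download-and-convert step described above can be sketched in Python. The paths are placeholders, and the converter script name and flags (`convert_hf_to_gguf.py` with `--outfile` and `--outtype`) are assumptions based on recent llama.cpp checkouts; verify them against the checkout you actually have, since the script has been renamed over time.

```python
import subprocess

# Placeholder paths: point these at your local clone and downloaded weights.
LLAMA_CPP_DIR = "llama.cpp"              # cloned from ggerganov/llama.cpp
HF_MODEL_DIR = "models/Mistral-7B-v0.1"  # Hugging Face weights, downloaded separately

# llama.cpp ships a converter that writes a GGUF file from Hugging Face weights.
cmd = [
    "python", f"{LLAMA_CPP_DIR}/convert_hf_to_gguf.py",
    HF_MODEL_DIR,
    "--outfile", "mistral-7b.gguf",
    "--outtype", "f16",  # keep fp16 here; quantize afterwards if needed
]

# Uncomment once the paths above exist on your machine:
# subprocess.run(cmd, check=True)
```

The resulting `.gguf` file is what you pass to llama.cpp (or a server built on it) when serving completions.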
-
Stuff we figured out about AI in 2023
> Instead, it turns out a few hundred lines of Python is genuinely enough to train a basic version!
Actually, it's not just a basic version. Llama 1/2's model.py is 500 lines: https://github.com/facebookresearch/llama/blob/main/llama/mo...
Mistral (is rumored to have) forked llama and is 369 lines: https://github.com/mistralai/mistral-src/blob/main/mistral/m...
and both of these are SOTA open source models.
-
How Open is Generative AI? Part 2
MistralAI, a French startup, developed a 7.3 billion parameter LLM named Mistral for various applications. Although MistralAI is committed to open-sourcing its technology under Apache 2.0, the training dataset details for Mistral remain undisclosed. The Mistral Instruct model was fine-tuned using publicly available instruction datasets from the Hugging Face repository, though specifics about the licenses and potential constraints are not detailed. Recently, MistralAI released Mixtral 8x7B, a model based on the sparse mixture of experts (SMoE) architecture, consisting of several specialized models (likely eight, as suggested by its name) activated as needed.
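The SMoE idea described above, where a router activates only a few experts per token instead of running the whole network, can be sketched with a toy top-k gate. Every name, shape, and the choice of k=2 here are illustrative assumptions for a minimal NumPy sketch, not Mixtral's actual implementation.

```python
import numpy as np

def smoe_forward(x, gate_w, experts, k=2):
    """Route one token through the top-k experts of a sparse MoE layer.

    x: (d,) token activation; gate_w: (d, n_experts) router weights;
    experts: list of callables, each mapping (d,) -> (d,).
    """
    logits = x @ gate_w            # one router score per expert
    top = np.argsort(logits)[-k:]  # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()       # softmax over the selected experts only
    # Only the chosen experts execute; the rest are skipped entirely, which
    # keeps inference cost low despite the large total parameter count.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy demo: 8 "experts" that each just scale the input by a different factor.
rng = np.random.default_rng(0)
d, n = 4, 8
gate_w = rng.normal(size=(d, n))
experts = [lambda v, s=s: s * v for s in range(1, n + 1)]
y = smoe_forward(rng.normal(size=d), gate_w, experts)
```

With k=2 of 8 experts active, each token pays roughly a quarter of the dense compute while the model still holds all eight experts' parameters.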
- Mistral website was just updated
- Mistral AI - open-source models
- Mistral 8x7B 32k model [magnet]
-
Ask HN: Why the LLaMA code base is so short
I was getting into LLMs and picked up some projects. I tried to dive into the code to see what the secret sauce is.
But the code is so short that there is nothing to really read.
https://github.com/facebookresearch/llama
I then proceeded to check https://github.com/mistralai/mistral-src and surprisingly it's the same.
What exactly are these codebases? It feels like they just download the models.
slint
-
Ask HN: Why would you ever use C++ for a new project over Rust?
Did you get a chance to check https://slint.dev?
Disclaimer: I work for Slint
-
Deno in 2023
Currently, we do it by using binaries through napi-rs so we can bring up a window using the platform-native API. And then we do some hacks to merge the event loops.
But if Deno supports bringing up a window directly, this means we can just ship wasm instead of a native binary for each platform. I also hope event-loop integration will be simplified.
Although we'd also need more APIs than just showing a window (mouse and keyboard input, accessibility, popup windows, system tray, ...)
[1] https://slint.dev
-
Slint GUI Toolkit
Rich Text content is not yet implemented. This is tracked in https://github.com/slint-ui/slint/issues/2723
Thanks for reporting the broken link. Fixed in https://github.com/slint-ui/slint/commit/9200480b532f49007d2...
-
slint VS rinf - a user suggested alternative
2 projects | 24 Jan 2024
-
A 2024 Plea for Lean Software
With Slint (https://slint.dev) we're trying to make a lightweight toolkit that doesn't use HTML/CSS, and that you can program from low-level languages such as C++ or Rust as well as from higher-level languages such as JavaScript; we want to extend to Python too.
-
Immediate Mode GUI Programming
I haven't. I was just searching for a GUI library that was Bevy-compatible and slint isn't at the moment: https://github.com/slint-ui/slint/discussions/940
Sorry!
-
Why the M2 is more advanced than it seemed
Trying to do that with Slint: https://slint.dev
- 9 years of Apple text editor solo dev
-
The Linux graphics stack in a nutshell, part 1
You can do that with Slint (https://slint.dev) and its linuxkms backend. No need for a xorg server or wayland compositor, just run the application made with Slint from the init script.
- Qt 6.6 and 6.7 Make QML Faster Than Ever: A New Benchmark and Analysis
What are some alternatives?
ReAct - [ICLR 2023] ReAct: Synergizing Reasoning and Acting in Language Models
tauri - Build smaller, faster, and more secure desktop applications with a web frontend.
lida - Automatic Generation of Visualizations and Infographics using Large Language Models
iced - A cross-platform GUI library for Rust, inspired by Elm
ragas - Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines
egui - egui: an easy-to-use immediate mode GUI in Rust that runs on both web and native
vllm - A high-throughput and memory-efficient inference and serving engine for LLMs
lvgl - Embedded graphics library to create beautiful UIs for any MCU, MPU and display type.
llama - Inference code for Llama models
dioxus - Fullstack GUI library for web, desktop, mobile, and more.
text-generation-webui-colab - A colab gradio web UI for running Large Language Models
cxx-qt - Safe interop between Rust and Qt