Contextualized-topic-models Alternatives
Similar projects and alternatives to contextualized-topic-models
-
OCTIS
OCTIS: Comparing Topic Models is Simple! A Python package to optimize and evaluate topic models (accepted at the EACL 2021 demo track)
-
tika-python
Tika-Python is a Python binding to the Apache Tika™ REST services, allowing Tika to be called natively from Python.
-
mlconjug3
A Python library to conjugate verbs in French, English, Spanish, Italian, Portuguese and Romanian (more soon) using Machine Learning techniques.
contextualized-topic-models reviews and mentions
-
[Project] Topic modelling of tweets from the same user
In our experiments, CTM works well with tweets: https://github.com/MilaNLProc/contextualized-topic-models (I'm one of the authors)
-
Extract words from large data set of reviews by sentiment
Use CTM https://github.com/MilaNLProc/contextualized-topic-models with sentiment labels to build distributions of words over labels
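The idea of a "distribution of words over labels" can be illustrated without the library: count each word's occurrences per sentiment label and normalize. CTM's labeled variant learns this jointly with the topics; the sketch below only shows the counting concept, and the reviews are made up for illustration.

```python
from collections import Counter, defaultdict

# Toy labeled reviews (label, text); in practice these come from your dataset.
reviews = [
    ("pos", "great battery great screen"),
    ("pos", "great price"),
    ("neg", "poor battery weak screen"),
]

# Count word occurrences per label.
counts = defaultdict(Counter)
for label, text in reviews:
    counts[label].update(text.split())

def word_distribution(label):
    """Normalize per-label counts into a probability distribution over words."""
    total = sum(counts[label].values())
    return {word: c / total for word, c in counts[label].items()}

# "great" accounts for 3 of the 6 positive tokens:
# word_distribution("pos")["great"] -> 0.5
```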
-
Using Transformer for Topic Modeling - what are the options?
This library from MILA seems quite neat! I haven't had the chance to play with it yet, though: https://github.com/MilaNLProc/contextualized-topic-models
-
Categorize the Data - Topic Modelling algorithm
a bit of shameless self-promotion, but we developed a topic model (https://github.com/MilaNLProc/contextualized-topic-models) that actually supports that use case!
-
(NLP) Best practices for topic modeling and generating interesting topics?
If you use CTM, you can provide the topic model two inputs: the preprocessed texts (used by the topic model to generate the topical words) and the unpreprocessed texts (used to generate the contextualized representations that are later concatenated to the document's bag-of-words representation). We saw that this slightly improves performance compared to giving BERT the already-preprocessed text. This feature is supported in the original implementation of CTM, not in OCTIS. See here: https://github.com/MilaNLProc/contextualized-topic-models#combined-topic-model
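A minimal sketch of preparing the two parallel inputs. The names `text_for_bow` and `text_for_contextual` mirror the keyword arguments of the library's `TopicModelDataPreparation.fit`, but the stopword list, documents, and preprocessing here are deliberately simplistic stand-ins, not the library's own pipeline.

```python
import re

# Hypothetical stopword list and documents, for illustration only.
STOPWORDS = {"the", "is", "a", "of"}

raw_docs = [
    "The model is a variant of neural topic models.",
    "BERT embeddings capture the meaning of a sentence.",
]

def preprocess(doc):
    """Lowercase, strip punctuation, and drop stopwords for the BoW input."""
    tokens = re.findall(r"[a-z]+", doc.lower())
    return " ".join(t for t in tokens if t not in STOPWORDS)

# Two parallel views of the same corpus:
# cleaned text feeds the bag-of-words, raw text feeds the contextual encoder.
text_for_bow = [preprocess(d) for d in raw_docs]
text_for_contextual = list(raw_docs)
```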
-
Latest trends in topic modelling?
Cross-lingual Contextualized Topic Models with Zero-shot Learning, from the MilaNLP team, uses bag-of-words representations in combination with multilingual embeddings from SBERT and works like a VAE: it encodes the input, then uses the encoded representation to decode back to a bag of words as close to the input as possible. Using SBERT embeddings makes their model generalise to other languages, which may be useful. One major shortfall of this model, as I understand it, is that it can't deal with long documents very elegantly - only up to BERT's input length limit (the workaround is to truncate and use only the first tokens).
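The encode-then-decode idea described above can be sketched in plain Python with toy, untrained weights. The real model learns these parameters and adds the variational machinery (and uses SBERT embeddings as the encoder input); everything below is illustrative only.

```python
import math
import random

random.seed(0)

VOCAB = ["topic", "model", "bert", "data", "text", "word"]
V, H = len(VOCAB), 2  # vocabulary size, latent size

# Toy random weights; in the real model these are learned by training.
enc_W = [[random.gauss(0, 0.1) for _ in range(H)] for _ in range(V)]
dec_W = [[random.gauss(0, 0.1) for _ in range(V)] for _ in range(H)]

def encode(bow):
    """Map a bag-of-words count vector to a low-dimensional representation."""
    return [sum(bow[i] * enc_W[i][h] for i in range(V)) for h in range(H)]

def decode(z):
    """Map the latent vector back to a softmax distribution over the vocabulary."""
    logits = [sum(z[h] * dec_W[h][i] for h in range(H)) for i in range(V)]
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def reconstruction_loss(bow, probs):
    """Cross-entropy between the input counts and the decoded distribution."""
    return -sum(c * math.log(p) for c, p in zip(bow, probs) if c > 0)

bow = [2, 1, 0, 1, 0, 0]  # counts for a toy document
probs = decode(encode(bow))
loss = reconstruction_loss(bow, probs)
```

Training would adjust the weights to drive this reconstruction loss down, so that the decoded word distribution matches the input bag of words as closely as possible.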
-
Stats
MilaNLProc/contextualized-topic-models is an open source project licensed under the MIT License, which is an OSI-approved license.
The primary programming language of contextualized-topic-models is Python.
Popular Comparisons
- contextualized-topic-models VS BERTopic
- contextualized-topic-models VS OCTIS
- contextualized-topic-models VS PolyFuzz
- contextualized-topic-models VS tika-python
- contextualized-topic-models VS transformers
- contextualized-topic-models VS Top2Vec
- contextualized-topic-models VS Sentimentanalysis
- contextualized-topic-models VS mlconjug3
- contextualized-topic-models VS TopMost
- contextualized-topic-models VS snscrape