🤓 Jules Belveze
┣━━ 📦 Open Source
┃   ┣━━ tsa - Dual-attention autoencoder
┃   ┣━━ bert-squeeze - Speed up Transformer models
┃   ┣━━ bundler - Learn from your data
┃   ┣━━ nhelper - Behavioral testing
┃   ┗━━ time-series-dataset - Dataset utilities
┣━━ 👍 Contributions
┃   ┣━━ 🤗 Hugging Face Ecosystem
┃   ┃   ┣━━ t5-small-headline-generation - T5 for headline generation
┃   ┃   ┗━━ tldr_news - Summarization dataset
┃   ┣━━ ❄️ John Snow Labs Ecosystem
┃   ┃   ┗━━ langtest - Deliver safe & effective NLP models
┃   ┣━━ 🧹 Dust
┃   ┃   ┗━━ Dust - Customizable and secure AI assistants
┃   ┣━━ 💫 spaCy Ecosystem
┃   ┃   ┗━━ concepCy - spaCy wrapper for ConceptNet
┃   ┣━━ bulk - contributed the color feature
┃   ┗━━ FastBERT - contributed batched inference
┗━━ 📄 Blogs & Papers
    ┣━━ Atlastic Reputation AI: Four Years of Advancing and Applying a SOTA NLP Classifier
    ┣━━ Real-World MLOps Examples: Model Development in Hypefactors
    ┣━━ LangTest: Unveiling & Fixing Biases with End-to-End NLP Pipelines
    ┣━━ Case Study: MLOps for NLP-powered Media Intelligence using Metaflow
    ┣━━ Scaling Machine Learning Experiments With neptune.ai and Kubernetes
    ┗━━ Scaling-up PyTorch inference: Serving billions of daily NLP inferences with ONNX Runtime
I currently work as a Software Engineer at @Dust.
My previous experience includes leading AI development and building AI infrastructure from the ground up at Ava, as well as spearheading MLOps and NLP projects at John Snow Labs. Before that, I engineered multilingual NLP solutions at Hypefactors and conducted deep learning research at Microsoft.
I believe that automating model development and deployment through MLOps enables faster feature releases. To that end, I have worked with tools such as PyTorch Lightning, FastAPI, Hugging Face, Kubernetes, ONNX Runtime, and more.
Beyond MLOps, I have worked extensively with deep learning on time series, writing my Master's thesis on anomaly detection in high-dimensional time series. I am also keenly interested in state-of-the-art techniques for speeding up the inference of deep learning models, especially Transformer-based ones.
I am an avid open source contributor and advocate for ethical AI practices.