BLIP-ImageCaption
Updated Jan 21, 2024 - Python
A text generation library to paraphrase image captions using back-translation or transfer learning.
PyTorch implementation of Image captioning with Bottom-up, Top-down Attention
A simple toolkit to transform a data source generated by img2dataset from Parquet files into a Hugging Face dataset.
Marshall: Modality-Agnostic Representation learning by SHAred pre-training of muLtiple modaLities
NeuralTalk is a Python+numpy project for learning Multimodal Recurrent Neural Networks that describe images with sentences.
PyTorch image-caption retrieval model.
A MindSpore implementation of "Show, Attend and Tell: Neural Image Caption Generation with Visual Attention".
A project aimed at enhancing the visual experience of individuals with visual impairments. Leveraging machine learning and natural language processing, this repository houses the codebase for generating efficient and coherent natural-language descriptions of captured images, integrated with image recognition.
A MindSpore implementation of the paper "Show and Tell: Neural Image Caption Generation".
Image captioning project.
PyTorch implementation of image captioning based on attention mechanism
Image Captioning with Google's NIC for AI Challenger
End-to-end deep learning model that generates image captions.
This repository reimplements the "Show, Attend and Tell" model and adds extra deep learning techniques.
Image captioning model with Resnet50 encoder and LSTM decoder
A Python 3 library for NLP and image-caption metrics: BLEU, METEOR, CIDEr, ROUGE, SPICE, WMD.
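The entry above lists the standard caption-evaluation metrics. As a rough illustration of the simplest of these, here is a minimal sentence-level BLEU sketch in pure Python with uniform n-gram weights and a brevity penalty; the function name and defaults are illustrative, not that library's API:

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """All contiguous n-grams of a token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, references, max_n=4):
    """Sentence-level BLEU: clipped n-gram precision x brevity penalty.

    candidate: list of tokens; references: list of token lists.
    """
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        if not cand_counts:
            return 0.0  # candidate shorter than n
        # Clip each n-gram count by its maximum count over all references.
        max_ref = Counter()
        for ref in references:
            for g, c in Counter(ngrams(ref, n)).items():
                max_ref[g] = max(max_ref[g], c)
        clipped = sum(min(c, max_ref[g]) for g, c in cand_counts.items())
        if clipped == 0:
            return 0.0  # zero precision at some order
        log_precisions.append(math.log(clipped / sum(cand_counts.values())))
    # Brevity penalty against the reference whose length is closest.
    c_len = len(candidate)
    r_len = min((abs(len(r) - c_len), len(r)) for r in references)[1]
    bp = 1.0 if c_len > r_len else math.exp(1 - r_len / c_len)
    return bp * math.exp(sum(log_precisions) / max_n)
```

An identical candidate and reference score 1.0; a candidate sharing no n-gram of some order with any reference scores 0.0 (production implementations typically smooth instead of zeroing out).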