
Tagonizer


Inspiration

Everyone loves online shopping 🛒. But when it comes to selecting a product, we depend on genuine reviews and actual product images posted by past customers. On popular e-commerce websites like Amazon, BestBuy, eBay, Etsy etc., these reviews, tags and user images can be misleading, incorrect, sponsored or simply useless 😕.

The main problems can be summed up in the following points:

  • Useless or Meaningless Auto-Generated Tags 🏷️
  • Reviews not classified as positive👍 or negative 👎
  • Tags not classified as positive👍 or negative 👎
  • Repeated Tags 🏷️🤔
  • Paid Review Tags that don't reflect sentiments of Customers 😢
  • Misleading Images that don't match the actual product ❓
  • A new buyer or potential customer has to go through hundreds of reviews, tags and images to judge the product properly 😳

What it does

Tagonizer is an AI-powered Chrome extension that users can open while visiting the web page 🌐 of a desired product. The extension auto-fetches all reviews 💬 and customer images 📷 and passes them through our AI engine. It then shows the user tags auto-generated from the reviews along with their sentiment (Positive/Negative/Moderate), classifies the fetched reviews, and displays only the images that accurately match the product.

How we built it

Frontend

  • All reviews and customer/seller images are fetched from the web page using Cheerio; this data is bundled and passed to the backend API service.
  • The returned results are re-checked for proper formatting and rendered on the frontend using React.
  • The UI/UX is kept user-friendly and easy to understand, with all important buttons on the first page itself.

Backend

The Chrome extension calls the Azure API Management service, which fronts two different Azure Function apps: one for text reviews and the other for customer images.

Text Reviews Analytics 💬

  • A list of all reviews is obtained
  • These reviews are passed through multiple data-cleaning layers, such as removing unnecessary characters, emojis etc.
  • The cleaned reviews are passed to the Azure Text Analytics cognitive service, which returns key phrases (tags) and their associated sentiments
  • The output is bundled as JSON and returned to the frontend
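The cleaning step above can be sketched roughly as follows. This is only an illustration, not the project's actual code: the function name is made up, and the Azure Text Analytics calls are shown commented out since they require a live KEY/ENDPOINT.

```python
import re

def clean_review(text: str) -> str:
    """Strip emojis and other non-printable characters, then collapse whitespace."""
    text = re.sub(r"[^\x20-\x7E]", " ", text)  # drop emojis / non-ASCII symbols
    return re.sub(r"\s+", " ", text).strip()

# The cleaned reviews would then go to Azure Text Analytics, e.g. with the
# azure-ai-textanalytics SDK (requires the KEY and ENDPOINT env vars):
#
#   from azure.ai.textanalytics import TextAnalyticsClient
#   from azure.core.credentials import AzureKeyCredential
#   client = TextAnalyticsClient(endpoint=ENDPOINT,
#                                credential=AzureKeyCredential(KEY))
#   key_phrases = client.extract_key_phrases(cleaned_reviews)
#   sentiments = client.analyze_sentiment(cleaned_reviews)
```

Note that stripping everything outside printable ASCII also drops accented characters; a production cleaner would be more selective.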

Customer Image Analytics 📷

  • A list of customer image links and seller image links is obtained
  • These image links are passed through multiple data-preparation layers, such as image resizing, URL correction etc.
  • The links are passed to the Azure Computer Vision cognitive service, which categorizes each image and returns the objects detected in it
  • A comparison is made between the seller image tags (known-correct images), the product name and genre, and the customer image tags; the links of customer images relevant to the product or its genre are set as the output
  • The output is bundled as JSON and returned to the frontend
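The comparison step can be illustrated with a simple tag-overlap heuristic. This is a sketch only; the function name, parameters and matching rule are assumptions, not the project's exact logic.

```python
def relevant_customer_images(seller_tags, product_terms, customer_images):
    """Keep customer image links whose detected tags overlap the seller image
    tags or the product name/genre terms (a simple relevance heuristic)."""
    reference = {t.lower() for t in seller_tags} | {t.lower() for t in product_terms}
    return [link for link, tags in customer_images.items()
            if reference & {t.lower() for t in tags}]

# Example: only the image tagged like the product survives the filter.
images = {"cust1.jpg": ["shoe", "leather"], "cust2.jpg": ["cat", "sofa"]}
print(relevant_customer_images(["Shoe", "boot"], ["footwear"], images))
# → ['cust1.jpg']
```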

System Design

System Design Implementation

How to Run It

Frontend

  1. Go to Releases and download the latest Tagonizer.zip file.
  2. Follow the illustration given below to set up the extension on Google Chrome running on macOS. For other operating systems, the steps should be very similar.

Demo

Backend

  1. Clone the repository.
$ git clone https://github.com/ankuraxz/Tagonizer-2.git
  2. Navigate into the cloned repository.
$ cd Tagonizer-2
  3. Create a virtual environment and activate it.
$ python -m venv venv/
$ source venv/bin/activate
  4. Install the requirements.
$ pip install -r requirements.txt
  5. Create an Azure resource for Text Analytics. Afterwards, get the key generated for you to authenticate your requests.
  6. Set the environment variables KEY, ENDPOINT and LOCATION to the secret token/key, the endpoint/base URL and the location of the resource, respectively.
  7. Create a Computer Vision resource in the Azure portal to get your key and endpoint. After it deploys, click Go to resource.
  • You will need the key and endpoint from the resource you create to connect your application to the Computer Vision service.
  • You can use the free pricing tier (F0) to try the service and upgrade later to a paid tier for production.
  • Save them in the environment as VKEY and VENDPOINT for the key and endpoint, respectively.
  8. Run the following command to start the backend at http://localhost:8000/
$ uvicorn API.main:app --reload --host=0.0.0.0 --port=8000
  9. Open http://localhost:8000/ in a browser of your choice. You will be greeted with the Swagger UI; further details are available there.
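Since the backend reads its credentials from the environment (steps 6 and 7 above), a fail-fast check like the following can catch a missing variable before startup. This is a sketch under the assumption that those five variable names are the only ones required; the project's actual code may read them differently.

```python
import os

# Environment variables the backend expects (from the setup steps above).
REQUIRED_VARS = ["KEY", "ENDPOINT", "LOCATION", "VKEY", "VENDPOINT"]

def load_azure_settings() -> dict:
    """Return the Azure credentials from the environment, or fail loudly."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in REQUIRED_VARS}
```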

Challenges we ran into

Challenges 🏋️ and hurdles 🚧 are the most important part of any journey; they make us stronger by continuously testing our enthusiasm and patience. We too faced many challenges while developing the Tagonizer extension, some of which are:

  • Code collaboration 🤝: Since we were in the middle of a pandemic, it was hard to collaborate in person to discuss ideas and develop, so we relied on tools like Google Meet, Discord, GitHub, UseTogether etc.
  • Errors and bugs 🐛: The destiny of every developer ever. Errors and bugs are our constant companions, and we love to eradicate them, thanks to Stack Overflow, our trustworthy friend.
  • Huge data to test 📊: We had the entire Amazon website in front of us, and the challenge was to fetch customer reviews efficiently without affecting site or extension performance.
  • Efficient UI/UX 🖥️: Users are our first priority; every bit of attention was paid to the UI of the extension, which was a real challenge to achieve.
  • Production errors ⚠️: Production testing is always different from local testing; we faced such issues too, and the last-minute debugging was really intense.
  • Breaking bad 💥: In production too, our extension and backend API broke a few times, due to load, edge cases or unseen bugs, but all of them were fixed.

Accomplishments that we're proud of

  • We hosted the entire backend using Azure's serverless functionality
  • We tested the API vigorously, bringing it close to production quality
  • The backend code is made very efficient
  • The frontend was reframed using React, making it much easier to manage all components and more efficient at fetching data

Updates over the previous version

The entire project has been refactored:

  • Reframed the frontend using React instead of plain HTML and CSS, making it much faster and more efficient in terms of UI/UX and data fetching
  • Shifted the entire backend from Heroku to Azure serverless (with API Management), reducing latency from 48 s to approximately 400 ms
  • Introduced logging and efficient error-handling techniques

What we learned

  • React
  • Azure Functions
  • Azure API Management
  • Azure App service

What's next for Tagonizer

We will be releasing a public version after a bit more testing. Any user will then be able to install and use the extension after authentication.