Content-Optimization-with-BERT

A notebook containing code that helps you reorganize your content to optimize it for BERT and achieve better SEO results.

The Purpose

  • The purpose of this code is to test Bidirectional Encoder Representations from Transformers (BERT) in order to understand how BERT answers questions given a context.
  • It also shows how embeddings help with analyzing the similarity of words with and without context.

What does it do?

  • Using the text from your webpage as context, the code uses BERT to answer any question pertaining to the text (see the first sketch after this list).
  • The output contains the answer, the level of confidence (score), and the answer's starting and ending indices in the source context.
  • Given a list of words or sentences, it can measure the similarity between the words, with or without context (see the second sketch).
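
As a minimal sketch of the question-answering step, the snippet below uses the Hugging Face transformers question-answering pipeline with a SQuAD-fine-tuned BERT checkpoint. The model name, question, and context are illustrative assumptions rather than values taken from the notebook.

```python
from transformers import pipeline

# Checkpoint and example text are assumptions for illustration;
# any extractive question-answering checkpoint will work.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT is a deep learning model that reads text bidirectionally, "
    "so search engines use it to better match questions to answers."
)

result = qa(question="How does BERT read text?", context=context)

# The pipeline returns the extracted answer, a confidence score, and the
# character offsets of the answer within the context.
print(result["answer"])                # e.g. "bidirectionally"
print(result["score"])                 # confidence between 0 and 1
print(result["start"], result["end"])  # indices into `context`
```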

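For the similarity part, a sketch along these lines compares contextual embeddings of the same word in two different sentences. The checkpoint, the choice of layer (here, the last hidden layer), and the example sentences are assumptions; the notebook may pool layers differently.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# bert-base-uncased is an assumption; the notebook may use another checkpoint.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence, word):
    """Contextual embedding of the first occurrence of `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    token_id = tokenizer.convert_tokens_to_ids(word)
    position = (inputs["input_ids"][0] == token_id).nonzero()[0].item()
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_size)
    return hidden[0, position]

# The same word "bank" gets a different vector in each context,
# so the cosine similarity is noticeably below 1.0.
a = word_vector("He deposited the cash at the bank.", "bank")
b = word_vector("They had a picnic on the river bank.", "bank")
print(torch.nn.functional.cosine_similarity(a, b, dim=0).item())
```
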
Practical usage

  • You can use this code to optimize your content for search queries using deep learning.
  • Since Bing and other major search engines are using BERT, you can apply the script to the web pages that carry your most valuable information.
  • By feeding in questions your users frequently ask, you can test whether BERT extracts the best answers from your page.
  • If you run the script on your page and on a competing page, you can compare the scores, then reorganize your content to optimize it for BERT and gain a competitive edge (see the sketch below).
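
As a hypothetical illustration of that comparison workflow: the file names, questions, and the idea of reading each page from a plain-text dump are placeholders, not details from the notebook.

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

questions = [
    "What is content optimization?",
    "Why does BERT matter for SEO?",
]

# Plain-text dumps of each page; the file names are placeholders.
my_page = open("my_page.txt").read()
competitor_page = open("competitor_page.txt").read()

for question in questions:
    ours = qa(question=question, context=my_page)
    theirs = qa(question=question, context=competitor_page)
    better = "ours" if ours["score"] > theirs["score"] else "theirs"
    print(question)
    print(f"  ours: {ours['score']:.3f}  theirs: {theirs['score']:.3f}  -> {better}")
```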

Visit the notebook
