Search (#223)
* initial data and consts

* grammar file WIP + consts

* improve 'downloads' grammar for 'he'

* filter_results as a single name for filter rules

* restore rule name

* hasTextVar recognition and GrammarRulePercolatorQuery struct

* Percolator index settings (make.py+assets)

* GrammarRuleWithPercolatorQuery progress

* fixing index of percolator fields

* Free text variable

* createPerculateQuery wip

* perculate search (with highlight)

* detect text var value

* Ignore proceeding if $Text is not 1

* progress

* $ContentType variables and mapping

* Index Hebrew Rabash Assorted Notes with number

* complete index with Letter

* Complete

* grammar filtered results search + refactoring

* sources var value

* boost filtered by grammar score

* results language mapping on grammar search

* SearchGrammarsV2 fix

* grammar fixes

* grammars recover fix

* VAR_TEXT log

* grammar $text fixes and refactoring

* Completed plus

* content_type filter

* VariableMapToFilterValues fix

* remove spaces

* NewFilteredResultsSearchRequest error handling

* fix retrieveTextVarValues

* remove Explain from NewResultsSuggestGrammarV2CompletionRequest

* critical fix on assignment of values to variables in grammar

* fix ES_SRC_PARENTS_FOR_CHAPTER_POSITION_INDEX_LETTER

* synonym for Rabash Assorted Notes

* fix synonym (set tab instead of space)

* Incr. filtered by grammar MaxScore

* grammar filter search - progress and fixes

* meal var value

* rt for sourceRequests

* remove tweets search from filter grammar

* grammar engine refactoring

* highlight for grammar filtered search

* fix grammar suggest search

* grammar - heb test for hey letter

* restore prev commit

* refactoring

* fix sources index test

* map geresh symbols to apostrophe

* filter.grammar rules on russian and spanish

* content_type var in spanish and russian

* Ignore grammar for known source titles

* grammar filtered scores logic

* fix landing page intents

* CT vars: virtual_lessons, women_lessons (no spanish)

* small refactoring

* fix when we do not have regular results

* ver. incr.

* search without synonyms for grammar percolate

* FILTERED_BY_GRAMMAR_SCORE_INCREMENT = 100

* add title suggest weight = 40

* add title suggest weight = 40

* grammar filtered scores logic

* cancel PerculateQuery on exact terms

* additional variable values for sources and meals in hebrew

* books_titles + he fixes in content_type.variable

* new grammar boost logic

* content_type variables file fixes

* move suggest to right place

* suggest weight for ptiha

* Save deb parameter

* Correct

* consider global max score from filtered results

* remove space

* incr pticha autocomplete weight to 250

* undo changes

* comment

* remove commented out code

* comment

* comment

* comment

* version incr.

* comment

* comment

* whitelist indexing CT_LESSONS_SERIES

* Add grammar search latency to log

* avoid cases with more than 1 filter intent

* fix error message

* version incr.

* support of grammar filtered search in intents engine

* separate grammar logic to allow returning filter intents before searching for filtered results

* typo suggest - consider grammar free text

* filter intents small refactoring and fixes

* fixes

* consider grammar CT for intent types

* Intents carousel according to grammar - fixes

* INTENTS_SEARCH_BY_FILTER_GRAMMAR_COUNT

* more content_type var values for he,lessons

* FILTER_GRAMMAR_INCREMENT_FOR_MATCH_CT_AND_FULL_TERM only above 5

* avoid searching full term in grammar filter search if CT is articles

* filtered results scores tweak

* landing page heb daily kabbalah lesson value

* temp remove of content_type heb daily kabbalah lesson value

* fix GrammarVariablesMatch validation for filter intents

* update hebrew vars

* IntentSearchOptions (refactoring)

* comments

* ver incr

* grammar filter by source - progress

* filter by source progress

* sources grammar - carousel support WIP

* assign zero score for filtered hits that duplicate carousels results

* includeTypedUidsFromContentUnits

* adding he definite article 'the'

* adding 'writings' to books_titles

* remove the word 'peace' from Rabash synonym

* Revert "remove the word 'peace' from Rabash synonym"

This reverts commit 3054def.

* commented out log for debug

* Treat double single quotes as double quotes.

* Bring better landing pages by filtering duplications by collection

* comment

* comment

* ver. incr.

* LoadSourceNameTranslationsFromDB and combine results with translations from file

* set 'field' for percolatorQuery

* by_source rule progress

* additional source variables

* classification grammar by source

* cancel carousel for source+free text

* GRAMMAR_INTENT_CLASSIFICATION_BY_CONTENT_TYPE_AND_SOURCE progress

* GRAMMAR_INTENT_CLASSIFICATION_BY_CONTENT_TYPE_AND_SOURCE in GRAMMAR_INTENTS_TO_FILTER_VALUES

* assignedRulesSuggest no Suffixes

* GrammarVariablesMatch check for by_content_type_and_source rule

* fix GrammarVariablesMatch for GRAMMAR_INTENT_CLASSIFICATION_BY_CONTENT_TYPE_AND_SOURCE

* GRAMMAR_PERCULATE_SIZE = 5

* zohar source variables

* grammar scoring fixes for classification intents and misc

* disable intents engine if classification intents are returned from grammar

* avoid duplicates in classification intents from grammar engine

* classification.grammar fixes

* escaping q. in grammar index

* select intent with max score

* remove duplicate

* source variables

* article variable with :

* more source grammar filter variables and rules

* GRAMMAR_INTENT_FILTER_BY_SOURCE rule is not triggered with section filters

* Classification intents - combine and normalize results from GrammarEngine and IntentsEngine

* boostClassificationScore changes

* volume synonym

* volume synonym fix

* explanation to ClassificationIntent

* disable GRAMMAR_INTENT_CLASSIFICATION_BY_CONTENT_TYPE_AND_SOURCE

* avoid setting currentLang of empty results

* spanish grammar and variable data for by_source rule

* remove consts.GRAMMAR_INTENT_CLASSIFICATION_BY_CONTENT_TYPE_AND_SOURCE

* fix the remove of hits from 'filter grammar' that duplicates carousels source

* Introduction to The Study of the Ten Sefirot

* CONTENT_TYPE_INTENTS_BOOST

* GRAMMAR_INTENT_BY_POSITION WIP

* GRAMMAR_INTENT_FILTER_BY_SOURCE_AND_POSITION WIP

* grammar for position and position type - progress

* grammar small refactoring

* rename variable $PositionType to $DivisionType

* position var fixes

* Load source translations - Filter out Rabash Assorted Notes

* fix source variables SQL query

* remove by_source_and_position

* add heb. volume variable value

* fix typo

* disable suggest for GRAMMAR_INTENT_SOURCE_POSITION_WITHOUT_TERM

* sourcePathFromSql

* RB+BS Eng in source variables

* var values for articles and letters

* language fix in source.variable

* Rabash assorted notes source variable values

* remove 'note' from division type

* volume ru division_type

* GRAMMAR_INTENT_SOURCE_POSITION_WITHOUT_TERM progress

* fix source variable

* GRAMMAR_INTENT_SOURCE_POSITION_WITHOUT_TERM source fix

* article word in Ukrainian

* remove Rabash articles from source.variable

* Remove Rabash Letters from source.variable

* sourcePathFromSql with leafPrefixType

* GRAMMAR_INTENT_SOURCE_POSITION_WITHOUT_TERM fix

* list of sources that will not be included in SourcesByPositionAndParent map

* fix for grammar filter by source

* Disable 'by content type' priority boost if the query contains a number

* Tfilat Rabim article more common spelling

* fix loadSourcesByPositionAndParent

* sourcePathFromSql - attempt with default language (He)

* Allow only single sourcesPositionWithoutTerm

* do not trigger grammar if the query equals a value from source variables

* remove loadSourceTitlesWithMoreThanOeWord

* getSingleHitIntentsBySource when query term == source name

* allow grammar if term is author

* Return Library Landing Page for author names

* improve author recognition in SearchGrammarsV2

* check if term identical to source without quotes

* Fix logic of 'by content type' priority boost

* If term is identical to an author name, search only for Landing Pages grammar

* Allow LP intents for terms that are identical to source names (not only authors)

* fix log message

* fix log

* GRAMMAR_INTENT_SOURCE_POSITION_WITHOUT_TERM fix

* New TES value for RU source variable

* Allow new life result to be on 4th place

* lower classification intents score in getSingleHitIntentsBySource to display the source above them

* naming

* CONTENT_TYPE_INTENTS_BOOST = 4

* getSingleHitIntentsBySource according to section filter

* fix source name translations query

* source var fixes

* fix LoadSourceNameTranslationsFromDB SQL query

* remove commented code

* section filter support for "source position without term" rule

* add SRC_CONNECTING_TO_THE_SOURCE to SOURCE_PARENTS_NOT_TO_INCLUDE_IN_VARIABLE_VALUES

* change var names

* fix error func name in message

* comment

* comment

* better comment

* remove comment

* using consts for source types

* remove commented code

* comments

* comments

* grammar request refactoring

Co-authored-by: LAPTOP-NFLD56CB\Yuri <yurihechter@gmai.com>
Co-authored-by: Evgeny_v <gen.vinnikov@gmail.com>
Co-authored-by: davgur <gur28davaravut>
Co-authored-by: bbfsdev <bbfsdev@gmail.com>
4 people committed Feb 25, 2021
1 parent f326d3a commit ca0e8b4
Showing 16 changed files with 1,129 additions and 202 deletions.
92 changes: 71 additions & 21 deletions cache/search_stats.go
@@ -4,6 +4,8 @@ import (
"database/sql"
"encoding/json"
"fmt"
"strconv"
"strings"

"github.com/Bnei-Baruch/archive-backend/es"

@@ -142,16 +144,18 @@ type SearchStatsCache interface {
DoesHolidayExist(holiday string, year string) bool
DoesHolidaySingle(holiday string, year string) bool

DoesSourceTitleWithMoreThanOneWordExist(title string) bool
// Some sources (consts.NOT_TO_INCLUDE_IN_SOURCE_BY_POSITION) are excluded from these functions, so you should not use them for general purposes.
GetSourceByPositionAndParent(parent string, position string, sourceTypeIds []int64) *string
GetSourceParentAndPosition(source string, getSourceTypeIds bool) (*string, *string, []int64, error)
}

type SearchStatsCacheImpl struct {
mdb *sql.DB
tags ClassByTypeStats
sources ClassByTypeStats
conventions map[string]map[string]int
holidayYears map[string]map[string]int
sourceTitles map[string]bool
mdb *sql.DB
tags ClassByTypeStats
sources ClassByTypeStats
conventions map[string]map[string]int
holidayYears map[string]map[string]int
sourcesByPositionAndParent map[string]string
}

func NewSearchStatsCacheImpl(mdb *sql.DB) SearchStatsCache {
@@ -194,9 +198,46 @@ func (ssc *SearchStatsCacheImpl) IsSourceWithEnoughUnits(uid string, count int,
return ssc.isClassWithUnits("sources", uid, count, cts...)
}

func (ssc *SearchStatsCacheImpl) DoesSourceTitleWithMoreThanOneWordExist(title string) bool {
_, exist := ssc.sourceTitles[title]
return exist
func (ssc *SearchStatsCacheImpl) GetSourceByPositionAndParent(parent string, position string, sourceTypeIds []int64) *string {
if len(sourceTypeIds) == 0 { // len of a nil slice is 0, so the nil check is redundant
sourceTypeIds = consts.ALL_SRC_TYPES
}
for _, typeId := range sourceTypeIds { // range over the values, not the indices
// Key structure: parent of the requested source (like book name) - position of the requested source child (like chapter or part number) - source type (book, volume, article, etc...)
key := fmt.Sprintf("%v-%v-%v", parent, position, typeId)
if src, ok := ssc.sourcesByPositionAndParent[key]; ok {
return &src
}
}
return nil
}

func (ssc *SearchStatsCacheImpl) GetSourceParentAndPosition(source string, getSourceTypeIds bool) (*string, *string, []int64, error) {
var parent *string
var position *string
typeIds := []int64{}
// If this function is needed for common usage, it is better to optimize it by maintaining a reverse map.
for k, v := range ssc.sourcesByPositionAndParent {
if v == source {
s := strings.Split(k, "-")
if parent == nil {
parent = &s[0]
}
if position == nil {
position = &s[1]
}
typeIdStr := s[2]
if !getSourceTypeIds {
break
}
typeId, err := strconv.ParseInt(typeIdStr, 10, 64)
if err != nil {
return nil, nil, []int64{}, err
}
typeIds = append(typeIds, typeId)
}
}
return parent, position, typeIds, nil
}

func (ssc *SearchStatsCacheImpl) isClassWithUnits(class, uid string, count int, cts ...string) bool {
@@ -239,11 +280,10 @@ func (ssc *SearchStatsCacheImpl) Refresh() error {
if err != nil {
return errors.Wrap(err, "Load holidays stats.")
}
ssc.sourceTitles, err = ssc.loadSourceTitlesWithMoreThanOeWord()
ssc.sourcesByPositionAndParent, err = ssc.loadSourcesByPositionAndParent()
if err != nil {
return errors.Wrap(err, "Load source titles with more than one word.")
return errors.Wrap(err, "Load sources by position and parent.")
}

return nil
}

@@ -403,22 +443,32 @@ group by s.id, cu.type_id;`).Query()
return tags.flatten(), sources.flatten(), nil
}

func (ssc *SearchStatsCacheImpl) loadSourceTitlesWithMoreThanOeWord() (map[string]bool, error) {
rows, err := queries.Raw(ssc.mdb, `select distinct sn.name from sources s
join source_i18n sn on sn.source_id=s.id
where (length(sn.name) - length(replace(sn.name, ' ', ''))) > 0`).Query()
func (ssc *SearchStatsCacheImpl) loadSourcesByPositionAndParent() (map[string]string, error) {
queryMask := `select p.uid as parent_uid, c.uid as source_uid, c.position, c.type_id from sources p
join sources c on c.parent_id = p.id
where c.position is not null and p.uid not in (%s)`
notToInclude := []string{}
for _, s := range consts.NOT_TO_INCLUDE_IN_SOURCE_BY_POSITION {
notToInclude = append(notToInclude, fmt.Sprintf("'%s'", s))
}
query := fmt.Sprintf(queryMask, strings.Join(notToInclude, ","))
rows, err := queries.Raw(ssc.mdb, query).Query() // Authors are not part of the query.
if err != nil {
return nil, errors.Wrap(err, "queries.Raw")
}
defer rows.Close()
ret := map[string]bool{}
ret := map[string]string{}
for rows.Next() {
var name string
err = rows.Scan(&name)
var parent_uid string // uid of parent source
var source_uid string // uid of child source
var position int // position of child source
var type_id int64 // type of child source
err = rows.Scan(&parent_uid, &source_uid, &position, &type_id)
if err != nil {
return nil, errors.Wrap(err, "rows.Scan")
}
ret[name] = true
key := fmt.Sprintf("%v-%v-%v", parent_uid, position, type_id)
ret[key] = source_uid
}
return ret, nil
}
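The cache above encodes each child source under a composite string key, `"<parent uid>-<position>-<type id>"`, built with `fmt.Sprintf` and decoded with `strings.Split`. A minimal sketch of that scheme (note: `buildKey` and `parseKey` are illustrative helper names, not functions from this codebase, and the scheme assumes none of the components contain `-`):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// buildKey mirrors the key layout used for sourcesByPositionAndParent:
// "<parent uid>-<position>-<type id>".
func buildKey(parentUID string, position int, typeID int64) string {
	return fmt.Sprintf("%v-%v-%v", parentUID, position, typeID)
}

// parseKey is the reverse operation, as done in GetSourceParentAndPosition.
// It assumes none of the components themselves contain '-'.
func parseKey(key string) (parent string, position int, typeID int64, err error) {
	s := strings.Split(key, "-")
	if len(s) != 3 {
		return "", 0, 0, fmt.Errorf("unexpected key: %q", key)
	}
	position, err = strconv.Atoi(s[1])
	if err != nil {
		return "", 0, 0, err
	}
	typeID, err = strconv.ParseInt(s[2], 10, 64)
	if err != nil {
		return "", 0, 0, err
	}
	return s[0], position, typeID, nil
}

func main() {
	key := buildKey("qMUUn22b", 5, 3) // uid of Shamati, chapter 5, assumed type id
	fmt.Println(key)                  // qMUUn22b-5-3
}
```

Because the key embeds the separator, a reverse map (source uid → parent, position, type) would avoid both the linear scan and the split, as the in-code comment suggests.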
85 changes: 74 additions & 11 deletions consts/consts.go
@@ -128,6 +128,10 @@ var ALL_KNOWN_LANGS = [...]string{
LANG_UKRAINIAN, LANG_AMHARIC,
}

var ALL_SRC_TYPES = []int64{
SRC_TYPE_COLLECTION, SRC_TYPE_BOOK, SRC_TYPE_VOLUME, SRC_TYPE_PART, SRC_TYPE_PARASHA, SRC_TYPE_CHAPTER, SRC_TYPE_ARTICLE, SRC_TYPE_TITLE, SRC_TYPE_LETTER, SRC_TYPE_ITEM,
}

var SRC_TYPES_FOR_TITLE_DESCRIPTION_CONCAT = map[int64]bool{
SRC_TYPE_VOLUME: true,
SRC_TYPE_PART: true,
@@ -331,6 +335,7 @@ const (
// Consider making a carousel and not limiting.
MAX_MATCHES_PER_GRAMMAR_INTENT = 3
FILTER_GRAMMAR_INCREMENT_FOR_MATCH_CT_AND_FULL_TERM = 200
CONTENT_TYPE_INTENTS_BOOST = 4.0 // For priority between several filter intent types
)

const (
@@ -453,6 +458,13 @@ var ES_INTENT_SUPPORTED_FILTERS = map[string]bool{
FILTER_SOURCE: true,
}

// If these filters are present, we automatically add some search results when the search term is identical to a source name.
var AUTO_INTENTS_BY_SOURCE_NAME_SUPPORTED_FILTERS = map[string]bool{
FILTERS[FILTER_UNITS_CONTENT_TYPES]: true,
FILTERS[FILTER_COLLECTIONS_CONTENT_TYPES]: true,
FILTERS[FILTER_SECTION_SOURCES]: true,
}

var ES_INTENT_SUPPORTED_CONTENT_TYPES = map[string]bool{
CT_LESSON_PART: true,
CT_LECTURE: true,
@@ -512,12 +524,16 @@ var INTENT_HIT_TYPE_BY_CT = map[string]string{
const (
GRAMMAR_INDEX = "grammar"

GRAMMAR_TYPE_FILTER = "filter"
GRAMMAR_TYPE_LANDING_PAGE = "landing-page"
GRAMMAR_TYPE_FILTER = "filter"
GRAMMAR_TYPE_LANDING_PAGE = "landing-page"
GRAMMAR_TYPE_CLASSIFICATION = "classification"

GRAMMAR_INTENT_FILTER_BY_CONTENT_TYPE = "by_content_type"
GRAMMAR_INTENT_FILTER_BY_CONTENT_TYPE = "by_content_type"
GRAMMAR_INTENT_FILTER_BY_SOURCE = "by_source"
GRAMMAR_INTENT_SOURCE_POSITION_WITHOUT_TERM = "source_position_without_term"

GRAMMAR_LP_SINGLE_COLLECTION = "grammar_landing_page_single_collection_from_sql"
GRAMMAR_GENERATED_SOURCE_HIT = "grammar_generated_source_hit"

GRAMMAR_INTENT_LANDING_PAGE_LESSONS = "lessons"
GRAMMAR_INTENT_LANDING_PAGE_VIRTUAL_LESSONS = "virtual_lessons"
@@ -619,9 +635,18 @@ var GRAMMAR_INTENTS_TO_FILTER_VALUES = map[string]map[string][]string{
GRAMMAR_INTENT_LANDING_PAGE_DOWNLOADS: nil,
GRAMMAR_INTENT_LANDING_PAGE_HELP: nil,

GRAMMAR_INTENT_SOURCE_POSITION_WITHOUT_TERM: map[string][]string{
FILTERS[FILTER_SECTION_SOURCES]: []string{""},
FILTERS[FILTER_UNITS_CONTENT_TYPES]: []string{CT_LESSON_PART, CT_FULL_LESSON, CT_VIDEO_PROGRAM_CHAPTER},
FILTERS[FILTER_COLLECTIONS_CONTENT_TYPES]: []string{CT_DAILY_LESSON, CT_VIDEO_PROGRAM},
},

// Filters

GRAMMAR_INTENT_FILTER_BY_CONTENT_TYPE: nil,

// Currently this rule is not triggered with section filters. Consider enabling the combination of section filters with this rule.
GRAMMAR_INTENT_FILTER_BY_SOURCE: nil,
}

const (
@@ -633,8 +658,11 @@ const (
VAR_TEXT = "$Text"
VAR_HOLIDAYS = "$Holidays"
VAR_CONTENT_TYPE = "$ContentType"
VAR_SOURCE = "$Source"
VAR_POSITION = "$Position"
VAR_DIVISION_TYPE = "$DivType"

// $ContentType variables
// $ContentType variable values

VAR_CT_PROGRAMS = "programs"
VAR_CT_ARTICLES = "articles"
Expand All @@ -655,6 +683,14 @@ const (
VAR_CT_HOLIDAYS = "holidays"
VAR_CT_CONVENTIONS = "conventions"
*/

// $DivisionType variable values

VAR_DIV_ARTICLE = "article"
VAR_DIV_CHAPTER = "chapter"
VAR_DIV_VOLUME = "volume"
VAR_DIV_PART = "part"
VAR_DIV_NUMBER = "number"
)

// Grammar $ContentType variables to content type filters mapping.
@@ -708,6 +744,8 @@ var VARIABLE_TO_FILTER = map[string]string{
VAR_TEXT: "text",
VAR_HOLIDAYS: "holidays",
VAR_CONTENT_TYPE: "content_type",
VAR_SOURCE: "source",
VAR_POSITION: "position",
}

// Latency log
@@ -725,6 +763,7 @@ const (
LAT_GETSUGGESTIONS_MULTISEARCHDO = "GetSuggestions.MultisearchDo"
LAT_DOSEARCH_GRAMMARS_MULTISEARCHGRAMMARSDO = "DoSearch.SearchGrammars.MultisearchGrammarsDo"
LAT_DOSEARCH_GRAMMARS_MULTISEARCHFILTERDO = "DoSearch.SearchGrammars.MultisearchFilterDo"
LAT_DOSEARCH_GRAMMARS_RESULTSTOINTENTS = "DoSearch.SearchGrammars.ResultsToIntents"
)

var LATENCY_LOG_OPERATIONS_FOR_SEARCH = []string{
@@ -741,13 +780,19 @@ var LATENCY_LOG_OPERATIONS_FOR_SEARCH = []string{
}

const (
SRC_SHAMATI = "qMUUn22b"
SRC_NONE_ELSE_BESIDE_HIM = "hFeGidcS"
SRC_PEACE_ARCTICLE = "28Cmp7gl"
SRC_PEACE_IN_WORLD_ARTICLE = "hqUTKcZz"
SRC_ARVUT_ARTICLE = "itcVAcFn"
SRC_RABASH_ASSORTED_NOTES = "2GAdavz0"
SRC_THE_ROSE_ARTICLE = "yUcfylRm"
SRC_SHAMATI = "qMUUn22b"
SRC_NONE_ELSE_BESIDE_HIM = "hFeGidcS"
SRC_PEACE_ARCTICLE = "28Cmp7gl"
SRC_PEACE_IN_WORLD_ARTICLE = "hqUTKcZz"
SRC_ARVUT_ARTICLE = "itcVAcFn"
SRC_RABASH_ASSORTED_NOTES = "2GAdavz0"
SRC_THE_ROSE_ARTICLE = "yUcfylRm"
SRC_LETTERS_RABASH = "b8SHlrfH"
SRC_ARTICLES_RABASH = "rQ6sIUZK"
SRC_ARTICLES_BAAL_SULAM = "qMeV5M3Y"
SRC_BAAL_SULAM_ARTICLES_LETTERS_SUMMARIES = "QUBP2DYe"
SRC_BAAL_SULAM_WRITINGS_CAMPUS_RU = "8Y0f8Jg9"
SRC_CONNECTING_TO_THE_SOURCE = "wWm6fbn4"
)

var ES_SUGGEST_SOURCES_WEIGHT = map[string]float64{
@@ -777,3 +822,21 @@ var ES_SRC_PARENTS_FOR_CHAPTER_POSITION_INDEX = map[string]PositionIndexType{
SRC_SHAMATI: LETTER_IF_HEBREW,
SRC_RABASH_ASSORTED_NOTES: ALWAYS_NUMBER,
}

var ES_GRAMMAR_DIVT_TYPE_TO_SOURCE_TYPES = map[string][]int64{
VAR_DIV_ARTICLE: []int64{SRC_TYPE_ARTICLE},
VAR_DIV_CHAPTER: []int64{SRC_TYPE_CHAPTER, SRC_TYPE_ARTICLE, SRC_TYPE_LETTER},
VAR_DIV_VOLUME: []int64{SRC_TYPE_VOLUME},
VAR_DIV_PART: []int64{SRC_TYPE_PART},
VAR_DIV_NUMBER: []int64{SRC_TYPE_CHAPTER, SRC_TYPE_ARTICLE, SRC_TYPE_LETTER},
}

var NOT_TO_INCLUDE_IN_SOURCE_BY_POSITION = []string{
SRC_LETTERS_RABASH, SRC_ARTICLES_RABASH, SRC_ARTICLES_BAAL_SULAM, // The 'position' values of these sources' children do not match their actual chapter numbers
}

// We avoid adding source names from Rabash Assorted Notes because many of them resemble concepts or topics and are less known as names of Rabash sources.
// We also avoid adding names of article summaries and campus material, to prevent confusion with the original sources.
var SOURCE_PARENTS_NOT_TO_INCLUDE_IN_VARIABLE_VALUES = []string{
SRC_RABASH_ASSORTED_NOTES, SRC_BAAL_SULAM_ARTICLES_LETTERS_SUMMARIES, SRC_BAAL_SULAM_WRITINGS_CAMPUS_RU, SRC_CONNECTING_TO_THE_SOURCE,
}
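The `ES_GRAMMAR_DIVT_TYPE_TO_SOURCE_TYPES` map above resolves a `$DivType` grammar variable value to the source type ids to try, in priority order, and each candidate id yields one cache key for `GetSourceByPositionAndParent`. A sketch of that resolution (the `srcType*` numeric values and the `candidateKeys` helper are assumptions for illustration; the real `SRC_TYPE_*` constants live elsewhere in consts.go):

```go
package main

import "fmt"

// Assumed illustrative values; the real SRC_TYPE_* constants are defined in consts.go.
const (
	srcTypeVolume  int64 = 3
	srcTypeChapter int64 = 6
	srcTypeArticle int64 = 7
	srcTypeLetter  int64 = 9
)

// Mirrors ES_GRAMMAR_DIVT_TYPE_TO_SOURCE_TYPES: a $DivType value maps to
// the candidate source type ids, in priority order.
var divTypeToSourceTypes = map[string][]int64{
	"chapter": {srcTypeChapter, srcTypeArticle, srcTypeLetter},
	"volume":  {srcTypeVolume},
}

// candidateKeys shows how a match like "<parent> chapter 5" could be resolved:
// each candidate type id yields one "<parent uid>-<position>-<type id>" key to probe.
func candidateKeys(parentUID string, position int, divType string) []string {
	keys := []string{}
	for _, typeID := range divTypeToSourceTypes[divType] {
		keys = append(keys, fmt.Sprintf("%v-%v-%v", parentUID, position, typeID))
	}
	return keys
}

func main() {
	fmt.Println(candidateKeys("qMUUn22b", 5, "chapter"))
	// [qMUUn22b-5-6 qMUUn22b-5-7 qMUUn22b-5-9]
}
```

The first key that exists in the position-and-parent cache wins, which is why the slices are ordered from the most to the least likely type for each division word.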
5 changes: 4 additions & 1 deletion es/synonyms/en.txt
@@ -111,4 +111,7 @@ chanukah hanukkah hanuka chanuka

#Added by Eran at 12.06.20 (not in spreadsheet)
extend revealed
haim hadashim haim-hadashim new life
haim hadashim haim-hadashim new life

#Added at 19.02.20 (not in spreadsheet)
volume vol
2 changes: 1 addition & 1 deletion search/data/he.recall.csv
@@ -2,7 +2,7 @@ Language,Query,Weight,Bucket,#1,#2,#3,#4,#5,Comment,Grammar queries,P/A/O
he,542,1443,מספר תכנית (חיים חדשים),https://kabbalahmedia.info/he/programs/cu/AsNLozeK,,,,,,,pragmatic
he,707,1443,מספר תכנית (חיים חדשים),https://kabbalahmedia.info/he/programs/cu/fMJtiizU,,,,,,,pragmatic
he,103,1443,מספר תכנית (חיים חדשים),https://kabbalahmedia.info/he/programs/c/l4WQAsm9,https://kabbalahmedia.info/he/programs/cu/VrwG8S4o,,,,"All results are 103fm, should be more various",,pragmatic
he,209,1443,מספר תכנית (חיים חדשים),https://kabbalahmedia.info/he/programs/cu/g0s6w9ze,,,,,,,pragmatic
he,209,1443,מספר תכנית (חיים חדשים),,,,https://kabbalahmedia.info/he/programs/cu/g0s6w9ze,,,,pragmatic
he,פתיחה,474.2,מאמר בקבלה,https://kabbalahmedia.info/he/sources/kB3eD83I,,https://kabbalahmedia.info/he/lessons/series/c/XZoflItG,https://kabbalahmedia.info/he/lessons/series/c/Y4ybN1qM,https://kabbalahmedia.info/he/lessons/series/c/yoLixtE4,"If date older then week, article goes first.",,pragmatic
he,הקדמה לספר הזוהר,474.2,מאמר בקבלה,https://kabbalahmedia.info/he/sources/ALlyoveA,,https://kabbalahmedia.info/he/lessons/series/c/MTFgU93m,https://kabbalahmedia.info/he/lessons/series/c/Si44p2MQ,,unclear which one is the correct....,,pragmatic
he,מתן תורה,474.2,מאמר בקבלה,https://kabbalahmedia.info/he/sources/2bscFWf4,,https://kabbalahmedia.info/he/lessons/series/c/aMqw2MH4?language=he,https://kabbalahmedia.info/he/lessons/series/c/ZuAHvIQ5?language=he,,,,pragmatic
