Search (#216)
* span near query - not in order for full_title

* intents - search in title only

* source index - concat description with leaf title

* clear scroll sessions

* SRC_TYPES_FOR_TITLE_DESCRIPTION_CONCAT map

* add example to comment

* sources - index title in desc

* append description search for sources intents

* using uid instead of id for source consts

* const change ES_SOURCES_SUGGEST_DEFAULT_WEIGHT

* comment

* backtick strings

* create queries error handling

* comment

* const

* comments

* Middle changes

* createSpanNearQuery comment

* sql comment

* holidayVariable WIP

* commit

* Variable consts

* wip

* loading holidays translations from db

* empty map fix

* holidays grammar variables

* fix holiday vars loading

* add $Year to holiday grammar

* holiday translations - uid as key

* 2

* 3

* 4

* remove $Year from grammar

* WiP

* for 4 and 5, make SQ_GOOD if present on first page

* 5

* 6

* 7

* 8

* remove adding of author to Shamati subtitle + validate full_names instead of names

* add language to response

* restore years in holidays grammar

* omit non-existing holiday years

* refreshHolidayYears fix

* refreshHolidayYears logs

* remove logs

* fix SQL in refreshHolidayYears

* fix response to client nativize on GRAMMAR_TYPE_LANDING_PAGE

* fix refreshHolidayYears SQL query

* ignore index suggest for GRAMMAR_INTENT_LANDING_PAGE_HOLIDAYS

* addSuggest fix

* suggest only for 'holidays $Year'

* fix tokenize

* tokenizer small refactoring + more tests

* middle changes in Add_tests_for_indexing_sources_with_position

* Complete branch

* HolidaysLandingPageToCollectionHit

* refactoring

* remove duplicate results, refactor HolidaysLandingPageToCollectionHit

* Complete PR

* Completed

* DoesHolidaySingle

* ConventionsLandingPageToCollectionHit + fixes in HolidaysLandingPageToCollectionHit

* fix in refreshConventions and ConventionsLandingPageToCollectionHit

* fix HolidaysLandingPageToCollectionHit

* comment log

* specific holiday LP

* ParseExpectation - support holidays

* fix results uniqueness

* commit

* 2

* 3

* 4

* 5

* 6

* 7

* 8

* add language to response

* fix tokenize

* tokenizer small refactoring + more tests

* set good only when it is within the top 5 places

* add comment

* rabash index

* change expectation from LP to collection

* avoid index of holidays collections

* suggest changes

* IndexGrammars suggest fix for holidays

* IndexGrammars suggest holidays fix

* holidays grammar - remove condition for suggest

* ES_SRC_ADD_MAAMAR_TO_SUGGEST - Add HaShoshana article

* TestSourcesIndex refactoring

* email for index error reporting

* DumpIndexes before validating position is indexed in source full name

* comment out additional reindex

* assign id and uid to newly created test source

* restore

* DumpDB

* sources test fix and refactoring

* debug data

* remove dump

* restore dump

* required RefreshAll for some elastic versions (in sources index test)

* ver incr.

* uniqueHitsByMdbUid - refactor to a function and test

* remove MDB_ID from test Source

* ver incr.

* initial data and consts

* grammar file WIP + consts

* improve 'downloads' grammar for 'he'

* filter_results as a single name for filter rules

* restore rule name

* hasTextVar recognition and GrammarRulePercolatorQuery struct

* Percolator index settings (make.py+assets)

* GrammarRuleWithPercolatorQuery progress

* fixing index of percolator fields

* Free text variable

* createPerculateQuery wip

* percolate search (with highlight)

* detect text var value

* Ignore proceeding if $Text is not 1

* progress

* $ContentType variables and mapping

* Index Hebrew Rabash Assorted Notes with number

* complete index with Letter

* Complete

* grammar filtered results search + refactoring

* sources var value

* boost filtered by grammar score

* results language mapping on grammar search

* SearchGrammarsV2 fix

* grammar fixes

* grammars recover fix

* VAR_TEXT log

* grammar $text fixes and refactoring

* Completed plus

* content_type filter

* VariableMapToFilterValues fix

* remove spaces

* NewFilteredResultsSearchRequest error handling

* fix retrieveTextVarValues

* remove Explain from NewResultsSuggestGrammarV2CompletionRequest

* critical fix on assignment of values to variables in grammar

* fix ES_SRC_PARENTS_FOR_CHAPTER_POSITION_INDEX_LETTER

* synonym for Rabash Assorted Notes

* fix synonym (set tab instead of space)

* Incr. filtered by grammar MaxScore

* grammar filter search - progress and fixes

* meal var value

* rt for sourceRequests

* remove tweets search from filter grammar

* grammar engine refactoring

* highlight for grammar filtered search

* fix grammar suggest search

* grammar - heb test for hey letter

* restore prev commit

* refactoring

* fix sources index test

* map geresh symbols to apostrophe

* filter.grammar rules on russian and spanish

* content_type var in spanish and russian

* Ignore grammar for known source titles

* grammar filtered scores logic

* fix landing page intents

* CT vars: virtual_lessons, women_lessons (no spanish)

* small refactoring

* fix when we do not have regular results

* ver. incr.

* search without synonyms for grammar percolate

* FILTERED_BY_GRAMMAR_SCORE_INCREMENT = 100

* add title suggest weight = 40

* add title suggest weight = 40

* grammar filtered scores logic

* cancel PerculateQuery on exact terms

* additional variable values for sources and meals in hebrew

* books_titles + he fixes in content_type.variable

* new grammar boost logic

* content_type variables file fixes

* move suggest to right place

* suggest weight for ptiha

* Save deb parameter

* Correct

* consider global max score from filtered results

* remove space

* incr pticha autocomplete weight to 250

* undo changes

* comment

* remove commented out code

* comment

* comment

* comment

* version incr.

* comment

* comment

* whitelist indexing CT_LESSONS_SERIES

* Add grammar search latency to log

* avoid cases with more than 1 filter intent

* fix error message

* version incr.

* support of grammar filtered search in intents engine

* separate grammars logic to allow the return of filter intents before searching for filtered results

* typo suggest - consider grammar free text

* filter intents small refactoring and fixes

* fixes

* consider grammar CT for intent types

* Intents carousel according to grammar - fixes

* INTENTS_SEARCH_BY_FILTER_GRAMMAR_COUNT

* more content_type var values for he,lessons

* FILTER_GRAMMAR_INCREMENT_FOR_MATCH_CT_AND_FULL_TERM only above 5

* avoid searching full term in grammar filter search if CT is articles

* filtered results scores tweak

* landing page heb daily kabbalah lesson value

* temp remove of content_type heb daily kabbalah lesson value

* fix GrammarVariablesMatch validation for filter intents

* update hebrew vars

* IntentSearchOptions (refactoring)

* comments

* ver incr

Co-authored-by: LAPTOP-NFLD56CB\Yuri <yurihechter@gmai.com>
Co-authored-by: Evgeny_v <gen.vinnikov@gmail.com>
Co-authored-by: Eran Minuchin <EranMinuchin@gmail.com>
Co-authored-by: davgur <gur28davaravut>
Co-authored-by: emarchi <archivecabal@yahoo.com>
5 people committed Nov 7, 2020
1 parent 47bbce8 commit 988cbbd
Showing 10 changed files with 275 additions and 109 deletions.
4 changes: 2 additions & 2 deletions cmd/eval.go
@@ -350,7 +350,7 @@ func testTypoSuggestFn(cmd *cobra.Command, args []string) {

for _, t := range typos {
query := search.Query{Term: t, LanguageOrder: consts.SEARCH_LANG_ORDER[language]}
res, err := engine.GetTypoSuggest(query)
res, err := engine.GetTypoSuggest(query, nil)
utils.Must(err)
if res.Valid {
log.Infof("Suggest for '%s' is: '%s'.", t, res.String)
@@ -366,7 +366,7 @@ func testTypoSuggestFn(cmd *cobra.Command, args []string) {

query := search.Query{Term: e.Query, LanguageOrder: consts.SEARCH_LANG_ORDER[language]}

res, err := engine.GetTypoSuggest(query)
res, err := engine.GetTypoSuggest(query, nil)
utils.Must(err)
if res.Valid {
log.Infof("Suggest for '%s' is: '%s'. Check if this is false positive.", e.Query, res.String)
32 changes: 31 additions & 1 deletion consts/consts.go
@@ -320,7 +320,8 @@ var LANG2CODE = map[string]string{
// api

const (
INTENTS_SEARCH_COUNT = 10
INTENTS_SEARCH_DEFAULT_COUNT = 10
INTENTS_SEARCH_BY_FILTER_GRAMMAR_COUNT = 2
TWEETS_SEARCH_COUNT = 20
INTENTS_MIN_UNITS = 3
MAX_CLASSIFICATION_INTENTS = 3
@@ -463,6 +464,35 @@ var ES_INTENT_SUPPORTED_CONTENT_TYPES = map[string]bool{
CT_CLIP: true,
}

type IntentSearchOptions struct {
SearchTags bool
SearchSources bool
ContentTypes []string
}

var INTENT_OPTIONS_BY_GRAMMAR_CT_VARIABLES = map[string]IntentSearchOptions{
VAR_CT_PROGRAMS: IntentSearchOptions{
SearchSources: true,
SearchTags: true,
ContentTypes: []string{CT_VIDEO_PROGRAM_CHAPTER},
},
VAR_CT_ARTICLES: IntentSearchOptions{
SearchSources: true,
SearchTags: false,
ContentTypes: []string{CT_VIDEO_PROGRAM_CHAPTER, CT_LESSON_PART},
},
VAR_CT_LESSONS: IntentSearchOptions{
SearchSources: true,
SearchTags: true,
ContentTypes: []string{CT_LESSON_PART},
},
VAR_CT_BOOK_TITLES: IntentSearchOptions{
SearchSources: true,
SearchTags: false,
ContentTypes: []string{CT_VIDEO_PROGRAM_CHAPTER, CT_LESSON_PART},
},
}

// Fake index for intents.
var INTENT_INDEX_BY_TYPE = map[string]string{
INTENT_TYPE_TAG: INTENT_INDEX_TAG,
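For readers following the intents changes, below is a minimal, self-contained sketch of how the new IntentSearchOptions map can be consumed. The struct and the map shape are copied from the diff above; the constant values and the lookup site in main are hypothetical illustrations, not code from the repository.

package main

import "fmt"

// IntentSearchOptions and the variables-to-options map mirror the additions to consts/consts.go above.
type IntentSearchOptions struct {
	SearchTags    bool
	SearchSources bool
	ContentTypes  []string
}

// Hypothetical stand-ins for the VAR_CT_* and CT_* constants that live elsewhere in the consts package.
const (
	VAR_CT_PROGRAMS          = "$ContentType:programs"
	CT_VIDEO_PROGRAM_CHAPTER = "VIDEO_PROGRAM_CHAPTER"
)

var INTENT_OPTIONS_BY_GRAMMAR_CT_VARIABLES = map[string]IntentSearchOptions{
	VAR_CT_PROGRAMS: {
		SearchSources: true,
		SearchTags:    true,
		ContentTypes:  []string{CT_VIDEO_PROGRAM_CHAPTER},
	},
}

func main() {
	// When a grammar filter intent carries a $ContentType variable, the intents
	// engine can look up which intent searches (tags, sources) and which content
	// types to run for it.
	if opts, ok := INTENT_OPTIONS_BY_GRAMMAR_CT_VARIABLES[VAR_CT_PROGRAMS]; ok {
		fmt.Printf("tags: %v, sources: %v, content types: %v\n",
			opts.SearchTags, opts.SearchSources, opts.ContentTypes)
	}
}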
45 changes: 32 additions & 13 deletions search/engine.go
@@ -606,7 +606,12 @@ func (e *ESEngine) timeTrack(start time.Time, operation string) {
func (e *ESEngine) DoSearch(ctx context.Context, query Query, sortBy string, from int, size int, preference string, checkTypo bool, timeoutForHighlight time.Duration) (*QueryResult, error) {
defer e.timeTrack(time.Now(), consts.LAT_DOSEARCH)

// Initializing all channels.
suggestChannel := make(chan null.String)
grammarsSingleHitIntentsChannel := make(chan []Intent, 1)
grammarsFilterIntentsChannel := make(chan []Intent, 1)
grammarsFilteredResultsByLangChannel := make(chan map[string]FilteredSearchResult)
tweetsByLangChannel := make(chan map[string]*elastic.SearchResult)

var resultTypes []string
if sortBy == consts.SORT_BY_NEWER_TO_OLDER || sortBy == consts.SORT_BY_OLDER_TO_NEWER {
@@ -621,28 +626,34 @@ func (e *ESEngine) DoSearch(ctx context.Context, query Query, sortBy string, fro
}

// Search grammars in parallel to native search.
grammarsChannel := make(chan []Intent)
grammarsFilteredResultsByLangChannel := make(chan map[string]FilteredSearchResult)

go func() {
defer func() {
if err := recover(); err != nil {
log.Errorf("ESEngine.DoSearch - Panic searching grammars: %+v", err)
grammarsChannel <- []Intent{}
grammarsSingleHitIntentsChannel <- []Intent{}
grammarsFilterIntentsChannel <- []Intent{}
grammarsFilteredResultsByLangChannel <- map[string]FilteredSearchResult{}
}
}()
if grammars, filtered, err := e.SearchGrammarsV2(&query, from, size, sortBy, resultTypes, preference); err != nil {
if singleHitIntents, filterIntents, err := e.SearchGrammarsV2(&query, from, size, sortBy, resultTypes, preference); err != nil {
log.Errorf("ESEngine.DoSearch - Error searching grammars: %+v", err)
grammarsChannel <- []Intent{}
grammarsSingleHitIntentsChannel <- []Intent{}
grammarsFilterIntentsChannel <- []Intent{}
grammarsFilteredResultsByLangChannel <- map[string]FilteredSearchResult{}
} else {
grammarsChannel <- grammars
grammarsFilteredResultsByLangChannel <- filtered
grammarsSingleHitIntentsChannel <- singleHitIntents
grammarsFilterIntentsChannel <- filterIntents
if filtered, err := e.SearchByFilterIntents(filterIntents, query.Term, from, size, sortBy, resultTypes, preference, query.Deb); err != nil {
log.Errorf("ESEngine.DoSearch - Error searching filtered results by grammars: %+v", err)
grammarsFilteredResultsByLangChannel <- map[string]FilteredSearchResult{}
} else {
grammarsFilteredResultsByLangChannel <- filtered
}
}
}()

// Search tweets in parallel to native search.
tweetsByLangChannel := make(chan map[string]*elastic.SearchResult)
go func() {
defer func() {
if err := recover(); err != nil {
@@ -658,6 +669,8 @@ func (e *ESEngine) DoSearch(ctx context.Context, query Query, sortBy string, fro
}
}()

filterIntents := <-grammarsFilterIntentsChannel

if checkTypo {
go func() {
defer func() {
@@ -666,7 +679,7 @@ func (e *ESEngine) DoSearch(ctx context.Context, query Query, sortBy string, fro
suggestChannel <- null.String{"", false}
}
}()
if suggestText, err := e.GetTypoSuggest(query); err != nil {
if suggestText, err := e.GetTypoSuggest(query, filterIntents); err != nil {
log.Errorf("ESEngine.GetTypoSuggest - Error getting typo suggest: %+v", err)
suggestChannel <- null.String{"", false}
} else {
@@ -675,7 +688,7 @@ func (e *ESEngine) DoSearch(ctx context.Context, query Query, sortBy string, fro
}()
}

intents, err := e.AddIntents(&query, preference, consts.INTENTS_SEARCH_COUNT, sortBy)
intents, err := e.AddIntents(&query, preference, sortBy, filterIntents)
if err != nil {
log.Errorf("ESEngine.DoSearch - Error adding intents: %+v", err)
}
@@ -763,7 +776,7 @@ func (e *ESEngine) DoSearch(ctx context.Context, query Query, sortBy string, fro
}

query.Intents = append(query.Intents, intents...)
query.Intents = append(query.Intents, <-grammarsChannel...)
query.Intents = append(query.Intents, <-grammarsSingleHitIntentsChannel...)

log.Debugf("Intents: %+v", query.Intents)

@@ -800,7 +813,7 @@ func (e *ESEngine) DoSearch(ctx context.Context, query Query, sortBy string, fro
if hit.Score != nil {
if _, hasId := filtered.HitIdsMap[hit.Id]; hasId {
log.Infof("Same hit found for both regular and grammar filtered results: %v", hit.Id)
if hit.Score != nil {
if hit.Score != nil && *hit.Score > 5 { // We will increment the score only if the result is relevant enough (score > 5)
*hit.Score += consts.FILTER_GRAMMAR_INCREMENT_FOR_MATCH_CT_AND_FULL_TERM
}
// We remove this hit id from HitIdsMap in order to highlight the original search term and not $Text val.
@@ -820,7 +833,13 @@ func (e *ESEngine) DoSearch(ctx context.Context, query Query, sortBy string, fro
}
}

boost := (*maxRegularScore * 0.9) / *filtered.MaxScore
boost := ((*maxRegularScore * 0.9) + 10) / *filtered.MaxScore
// Why we add +10 to the formula:
// In some cases we have several regular results with very close scores that are above 90% of the maxRegularScore.
// Since the top score for the best 'filter grammar' result is 90% of the maxRegularScore,
// we have cases where the best 'filter grammar' result will rank below those top regular results with a VERY SMALL GAP between them.
// To minimize this gap, we add +10 to the formula.
// e.g. searching the term "ביטול קטעי מקור" without adding 10 brings the relevant result to position #4. With adding 10, the relevant result is first.
for _, hit := range result.Hits.Hits {
if hit.Score != nil {
*hit.Score *= boost
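To make the boost comment above concrete, here is a small stand-alone sketch that reproduces the ((maxRegularScore * 0.9) + 10) / filtered.MaxScore computation with invented example scores; only the formula comes from the diff, while the numbers and the helper name are illustrative.

package main

import "fmt"

// boostFactor mirrors the formula used in DoSearch for grammar-filtered results:
// the best filtered hit is scaled toward ~90% of the best regular score, plus a
// constant +10 that narrows the gap when several regular hits cluster near the top.
func boostFactor(maxRegularScore, filteredMaxScore float64) float64 {
	return ((maxRegularScore * 0.9) + 10) / filteredMaxScore
}

func main() {
	// Invented example scores, for illustration only.
	maxRegular := 40.0
	filteredMax := 20.0

	withoutPlus10 := (maxRegular * 0.9) / filteredMax  // 1.8 -> best filtered hit lands at 36.0
	withPlus10 := boostFactor(maxRegular, filteredMax) // 2.3 -> best filtered hit lands at 46.0

	fmt.Printf("boost without +10: %.2f (top filtered score %.1f)\n", withoutPlus10, withoutPlus10*filteredMax)
	fmt.Printf("boost with +10:    %.2f (top filtered score %.1f)\n", withPlus10, withPlus10*filteredMax)
}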
129 changes: 75 additions & 54 deletions search/grammar_v2_engine.go
@@ -147,22 +147,23 @@ func (e *ESEngine) suggestResultsToVariablesByPhrases(query *Query, result *elas
return ret, nil
}

func (e *ESEngine) SearchGrammarsV2(query *Query, from int, size int, sortBy string, resultTypes []string, preference string) ([]Intent, map[string]FilteredSearchResult, error) {
intents := []Intent{}
filtered := map[string]FilteredSearchResult{}
// Return: single hit intents, filtering intents
func (e *ESEngine) SearchGrammarsV2(query *Query, from int, size int, sortBy string, resultTypes []string, preference string) ([]Intent, []Intent, error) {
singleHitIntents := []Intent{}
filterIntents := []Intent{}
if query.Term != "" && len(query.ExactTerms) > 0 {
// Will never match any grammar for a query having both simple terms and exact terms.
// This is not accurate but an edge case. Need to better think of query representation.
log.Infof("Both term and exact terms are defined, should not trigger: [%s] [%s]", query.Term, strings.Join(query.ExactTerms, " - "))
return intents, filtered, nil
return singleHitIntents, filterIntents, nil
}
if e.cache != nil && e.cache.SearchStats().DoesSourceTitleWithMoreThanOneWordExist(query.Term) {
// Since some source titles contains grammar variable values,
// we are not triggering grammar search if the term eqauls to a title of a source.
// Some examples for such source titles:
// 'Book, Author, Story','Connecting to the Source', 'Introduction to articles', 'שיעור ההתגברות', 'ספר הזוהר'
log.Infof("The term is identical to a title of a source, should not trigger: [%s]", query.Term)
return intents, filtered, nil
return singleHitIntents, filterIntents, nil
}

multiSearchService := e.esc.MultiSearch()
@@ -189,56 +190,16 @@ func (e *ESEngine) SearchGrammarsV2(query *Query, from int, size int, sortBy str
return nil, nil, errors.New(fmt.Sprintf("Failed multi get: %+v", currentResults.Error))
}
language := query.LanguageOrder[i/2]
filterSearchRequests := []*elastic.SearchRequest{}
if haveHits(currentResults) {
if singleHitIntents, filterIntents, err := e.searchResultsToIntents(query, language, currentResults); err != nil {
if languageSingleHitIntents, languageFilterIntents, err := e.searchResultsToIntents(query, language, currentResults); err != nil {
return nil, nil, err
} else {
intents = append(intents, singleHitIntents...)
if filterIntents != nil && len(filterIntents) > 0 {
for _, filterIntent := range filterIntents {
// Currently we support "filter grammar" with only one appereance of each variable.
// This may be changed in the future.
if intentValue, ok := filterIntent.Value.(GrammarIntent); ok {
var contentType string
var text string
for _, fv := range intentValue.FilterValues {
if fv.Name == consts.VARIABLE_TO_FILTER[consts.VAR_CONTENT_TYPE] {
contentType = fv.Value
} else if fv.Name == consts.VARIABLE_TO_FILTER[consts.VAR_TEXT] {
text = fv.Value
}
if contentType != "" && text != "" {
break
}
}
if contentType != "" && text != "" {
log.Infof("Filtered Search Request: ContentType is %s, Text is %s.", contentType, text)
textValSearchRequests, err := NewFilteredResultsSearchRequest(text, contentType, from, size, sortBy, resultTypes, language, preference, query.Deb)
if err != nil {
return nil, nil, err
}
fullTermSearchRequests, err := NewFilteredResultsSearchRequest(query.Term, contentType, from, size, sortBy, resultTypes, language, preference, query.Deb)
if err != nil {
return nil, nil, err
}
filterSearchRequests = append(textValSearchRequests, fullTermSearchRequests...)
if len(filterSearchRequests) > 0 {
// All search requests here are for the same language
results, hitIdsMap, maxScore, err := e.filterSearch(filterSearchRequests)
if err != nil {
return nil, nil, err
}
filtered[language] = FilteredSearchResult{
Results: results,
Term: text,
ContentType: contentType,
HitIdsMap: hitIdsMap,
MaxScore: maxScore,
}
}
}
}
singleHitIntents = append(singleHitIntents, languageSingleHitIntents...)
if languageFilterIntents != nil {
if len(languageFilterIntents) > 1 {
return singleHitIntents, nil, errors.Errorf("Number of filter intents for language '%v' is %v but only 1 filter intent is currently supported.", language, len(languageFilterIntents))
} else if len(languageFilterIntents) == 1 {
filterIntents = append(filterIntents, languageFilterIntents...)
}
}
}
Expand All @@ -248,7 +209,67 @@ func (e *ESEngine) SearchGrammarsV2(query *Query, from int, size int, sortBy str
if elapsed > 10*time.Millisecond {
fmt.Printf("build grammar intent - %s\n\n", elapsed.String())
}
return intents, filtered, nil
return singleHitIntents, filterIntents, nil
}

// Search according to grammar based filter (currently by content types and free text).
func (e *ESEngine) SearchByFilterIntents(filterIntents []Intent, originalSearchTerm string, from int, size int, sortBy string, resultTypes []string, preference string, deb bool) (map[string]FilteredSearchResult, error) {
resultsByLang := map[string]FilteredSearchResult{}
for _, intent := range filterIntents {
if intentValue, ok := intent.Value.(GrammarIntent); ok {
var contentType string
var text string
for _, fv := range intentValue.FilterValues {
if fv.Name == consts.VARIABLE_TO_FILTER[consts.VAR_CONTENT_TYPE] {
contentType = fv.Value
} else if fv.Name == consts.VARIABLE_TO_FILTER[consts.VAR_TEXT] {
text = fv.Value
}
if contentType != "" && text != "" {
// Currently we support "filter grammar" with only one appearance of each variable.
// This may be changed in the future.
break
}
}
if contentType != "" && text != "" {
log.Infof("Filtered Search Request: ContentType is %s, Text is %s.", contentType, text)
requests := []*elastic.SearchRequest{}
textValSearchRequests, err := NewFilteredResultsSearchRequest(text, contentType, from, size, sortBy, resultTypes, intent.Language, preference, deb)
if err != nil {
return nil, err
}
requests = append(requests, textValSearchRequests...)
if contentType != consts.VAR_CT_ARTICLES {
fullTermSearchRequests, err := NewFilteredResultsSearchRequest(originalSearchTerm, contentType, from, size, sortBy, resultTypes, intent.Language, preference, deb)
if err != nil {
return nil, err
}
requests = append(requests, fullTermSearchRequests...)
}
if len(requests) > 0 {
// All search requests here are for the same language
results, hitIdsMap, maxScore, err := e.filterSearch(requests)
if err != nil {
return nil, err
}
resultsByLang[intent.Language] = FilteredSearchResult{
Results: results,
Term: text,
ContentType: contentType,
HitIdsMap: hitIdsMap,
MaxScore: maxScore,
}
if len(results) > 0 {
// we assume that there is no need to search the other languages if results were found for one language
break
}
}
}
} else {
return nil, errors.Errorf("FilterSearch error. Intent is not GrammarIntent. Intent: %+v", intent)
}
}
return resultsByLang, nil
}

func (e *ESEngine) VariableMapToFilterValues(vMap map[string][]string, language string) []FilterValue {
@@ -522,7 +543,7 @@ func retrieveTextVarValues(str string) []string {
return textVarValues
}

// Results search according to grammar based filter (currently by content types).
// Results search according to grammar based filter (currently by content types and free text).
// Return: Results, Unique list of hit id's as a map, Max score
func (e *ESEngine) filterSearch(requests []*elastic.SearchRequest) ([]*elastic.SearchResult, map[string]bool, *float64, error) {
results := []*elastic.SearchResult{}
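As a reading aid for the new SearchByFilterIntents function, the sketch below isolates its filter-value extraction step. The FilterValue shape and the $ContentType/$Text pairing mirror the diff; the concrete filter names, the helper name, and the sample values are assumptions for illustration only.

package main

import "fmt"

// FilterValue mirrors the name/value pairs that a GrammarIntent carries.
type FilterValue struct {
	Name  string
	Value string
}

// Hypothetical filter names that VARIABLE_TO_FILTER would map $ContentType and $Text to;
// the real keys are defined in the consts package.
const (
	filterContentType = "content_type"
	filterText        = "text"
)

// extractContentTypeAndText returns the first $ContentType and $Text values found,
// matching the "only one appearance of each variable" assumption in the commit.
func extractContentTypeAndText(fvs []FilterValue) (contentType, text string) {
	for _, fv := range fvs {
		switch fv.Name {
		case filterContentType:
			contentType = fv.Value
		case filterText:
			text = fv.Value
		}
		if contentType != "" && text != "" {
			break
		}
	}
	return contentType, text
}

func main() {
	// Invented filter values, for illustration only.
	fvs := []FilterValue{
		{Name: filterContentType, Value: "programs"},
		{Name: filterText, Value: "pticha"},
	}
	ct, text := extractContentTypeAndText(fvs)
	fmt.Printf("content type: %q, free text: %q\n", ct, text)
}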
4 changes: 4 additions & 0 deletions search/grammar_variables_matcher.go
@@ -36,6 +36,10 @@ func GrammarVariablesMatch(intent string, vMap map[string][]string, cm cache.Cac
hasVarContentType = true
}
}
if !(hasVarText && hasVarContentType) {
log.Warningf("Filter intent must have one appearance of $Text and one appearance of $ContentType")
return false
}
return true
} else if intent == consts.GRAMMAR_INTENT_LANDING_PAGE_CONVENTIONS {
location := ""
1 change: 1 addition & 0 deletions search/grammars/landing-pages.grammar
@@ -230,6 +230,7 @@ he,lesson_series => סדרות לימוד נבחרות
he,lesson_series => סדרות לימוד
he,lessons => הכנה לשיעור
he,lessons => שיעור בוקר
he,lessons => שיעור הקבלה היומי
he,lessons => שיעורים
he,lessons => שיעורים יומיים
he,lessons => שרטוטים
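The landing-pages.grammar lines above follow a "language,intent => phrase" layout. The sketch below shows one assumed way to parse such a line; it is not the project's actual grammar loader, and the helper and type names are invented for illustration.

package main

import (
	"fmt"
	"strings"
)

// grammarRule is an illustrative representation of one landing-pages.grammar line.
type grammarRule struct {
	Language string
	Intent   string
	Phrase   string
}

// parseGrammarLine splits a "lang,intent => phrase" line as it appears in the diff above.
// This is an assumed reading of the format, not the project's real loader.
func parseGrammarLine(line string) (grammarRule, error) {
	parts := strings.SplitN(line, "=>", 2)
	if len(parts) != 2 {
		return grammarRule{}, fmt.Errorf("no '=>' separator in line: %q", line)
	}
	left := strings.SplitN(strings.TrimSpace(parts[0]), ",", 2)
	if len(left) != 2 {
		return grammarRule{}, fmt.Errorf("left side must be 'lang,intent': %q", parts[0])
	}
	return grammarRule{
		Language: strings.TrimSpace(left[0]),
		Intent:   strings.TrimSpace(left[1]),
		Phrase:   strings.TrimSpace(parts[1]),
	}, nil
}

func main() {
	// The rule added by this commit.
	rule, err := parseGrammarLine("he,lessons => שיעור הקבלה היומי")
	if err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", rule)
}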
