This blog shows how a website search could be implemented using ASP.NET Core and Elasticsearch. Most users expect autocomplete and a flexible search, as known from the major search websites. When the user enters a character in the search input field, an autocomplete using a shingle token filter combined with a terms aggregation is used to suggest possible search terms.
In this example, custom analyzers are defined: one for the autocomplete, one for the search field, and a default analyzer for the remaining fields. The autocomplete analyzer uses a custom shingle token filter called autocompletefilter, a stop token filter, a lowercase token filter, and a stemmer token filter.
private IndexDefinition CreateNewIndexDefinition()
{
    return new IndexDefinition
    {
        IndexSettings =
        {
            Analysis = new Analysis
            {
                Filters =
                {
                    CustomFilters = new List<AnalysisFilterBase>
                    {
                        new StemmerTokenFilter("stemmer"),
                        new ShingleTokenFilter("autocompletefilter")
                        {
                            MaxShingleSize = 5,
                            MinShingleSize = 2
                        },
                        new StopTokenFilter("stopwords"),
                        new EdgeNGramTokenFilter("edge_ngram_filter")
                        {
                            MaxGram = 20,
                            MinGram = 2
                        }
                    }
                },
                Analyzer =
                {
                    Analyzers = new List<AnalyzerBase>
                    {
                        new CustomAnalyzer("edge_ngram_search")
                        {
                            Tokenizer = DefaultTokenizers.Standard,
                            Filter = new List<string> { DefaultTokenFilters.Lowercase, "edge_ngram_filter" },
                            CharFilter = new List<string> { DefaultCharFilters.HtmlStrip }
                        },
                        new CustomAnalyzer("autocomplete")
                        {
                            Tokenizer = DefaultTokenizers.Standard,
                            Filter = new List<string> { DefaultTokenFilters.Lowercase, "autocompletefilter", "stopwords", "stemmer" },
                            CharFilter = new List<string> { DefaultCharFilters.HtmlStrip }
                        },
                        new CustomAnalyzer("default")
                        {
                            Tokenizer = DefaultTokenizers.Standard,
                            Filter = new List<string> { DefaultTokenFilters.Lowercase, "stopwords", "stemmer" },
                            CharFilter = new List<string> { DefaultCharFilters.HtmlStrip }
                        }
                    }
                }
            }
        }
    };
}
The PersonCity class is used to add and search for documents in Elasticsearch. The default index and type for this class using ElasticsearchCRUD are personcitys and personcity.
public class PersonCity
{
    public long Id { get; set; }
    public string Name { get; set; }
    public string FamilyName { get; set; }
    public string Info { get; set; }
    public string CityCountry { get; set; }
    public string Metadata { get; set; }
    public string Web { get; set; }
    public string Github { get; set; }
    public string Twitter { get; set; }
    public string Mvp { get; set; }
}
A PersonCityMapping class is defined so that the required mapping for the PersonCityMappingDto class can be applied to the personcitys index and the personcity type. This class overrides ElasticsearchMapping to define the index and type.
using System;
using ElasticsearchCRUD;

namespace SearchComponent
{
    public class PersonCityMapping : ElasticsearchMapping
    {
        public override string GetIndexForType(Type type)
        {
            return "personcitys";
        }

        public override string GetDocumentType(Type type)
        {
            return "personcity";
        }
    }
}
The PersonCityMapping class is then registered with the mapping resolver so that the C# type PersonCityMappingDto maps to the same index and type that are used by default for the PersonCity class.
public PersonCitySearchProvider()
{
    _elasticsearchMappingResolver.AddElasticSearchMappingForEntityType(typeof(PersonCityMappingDto), new PersonCityMapping());

    _context = new ElasticsearchContext(ConnectionString, new ElasticsearchSerializerConfiguration(_elasticsearchMappingResolver))
    {
        TraceProvider = new ConsoleTraceProvider()
    };
}
A specific mapping DTO class is used to define the mapping in Elasticsearch. This class is only required if a non-default mapping is needed. It uses the ElasticsearchString attribute to define copy mappings: when a new document is added, the annotated fields are copied into the autocomplete and searchfield fields in Elasticsearch. The searchfield and autocomplete fields use the two custom analyzers which were defined in the index. This class is only used to define the type mapping in Elasticsearch.
using ElasticsearchCRUD.ContextAddDeleteUpdate.CoreTypeAttributes;

namespace SearchComponent
{
    public class PersonCityMappingDto
    {
        public long Id { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string Name { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string FamilyName { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string Info { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string CityCountry { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string Metadata { get; set; }

        public string Web { get; set; }
        public string Github { get; set; }
        public string Twitter { get; set; }
        public string Mvp { get; set; }

        [ElasticsearchString(Analyzer = "edge_ngram_search", SearchAnalyzer = "standard", TermVector = TermVector.yes)]
        public string searchfield { get; set; }

        [ElasticsearchString(Analyzer = "autocomplete")]
        public string autocomplete { get; set; }
    }
}
The IndexCreate method creates a new index and mapping in Elasticsearch.
public void CreateIndex()
{
    _context.IndexCreate<PersonCityMappingDto>(CreateNewIndexDefinition());
}
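If the provider can be called more than once, the index should only be created when it does not already exist. The following is a minimal sketch, assuming the IndexExists&lt;T&gt; helper on the ElasticsearchCRUD context (verify the method name against your ElasticsearchCRUD version); the EnsureIndexExists method name is just for illustration.

// Sketch: only create the index and mapping if they are not already present.
// Assumes the IndexExists<T> helper on ElasticsearchContext.
public void EnsureIndexExists()
{
    if (!_context.IndexExists<PersonCityMappingDto>())
    {
        CreateIndex();
    }
}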
The Elasticsearch settings can be viewed using the HTTP GET: http://localhost:9200/_settings
{ "personcitys": { "settings": { "index": { "creation_date": "1477642409728", "analysis": { "filter": { "stemmer": { "type": "stemmer" }, "autocompletefilter": { "max_shingle_size": "5", "min_shingle_size": "2", "type": "shingle" }, "stopwords": { "type": "stop" }, "edge_ngram_filter": { "type": "edgeNGram", "min_gram": "2", "max_gram": "20" } }, "analyzer": { "edge_ngram_search": { "filter": ["lowercase", "edge_ngram_filter"], "char_filter": ["html_strip"], "type": "custom", "tokenizer": "standard" }, "autocomplete": { "filter": ["lowercase", "autocompletefilter", "stopwords", "stemmer"], "char_filter": ["html_strip"], "type": "custom", "tokenizer": "standard" }, "default": { "filter": ["lowercase", "stopwords", "stemmer"], "char_filter": ["html_strip"], "type": "custom", "tokenizer": "standard" } } }, "number_of_shards": "5", "number_of_replicas": "1", "uuid": "TxS9hdy7SmGPr4FSSNaPiQ", "version": { "created": "2040199" } } } } }
The Elasticsearch mapping can be viewed using the HTTP GET: http://localhost:9200/_mapping
{ "personcitys": { "mappings": { "personcity": { "properties": { "autocomplete": { "type": "string", "analyzer": "autocomplete" }, "citycountry": { "type": "string", "copy_to": ["autocomplete", "searchfield"] }, "familyname": { "type": "string", "copy_to": ["autocomplete", "searchfield"] }, "github": { "type": "string" }, "id": { "type": "long" }, "info": { "type": "string", "copy_to": ["autocomplete", "searchfield"] }, "metadata": { "type": "string", "copy_to": ["autocomplete", "searchfield"] }, "mvp": { "type": "string" }, "name": { "type": "string", "copy_to": ["autocomplete", "searchfield"] }, "searchfield": { "type": "string", "term_vector": "yes", "analyzer": "edge_ngram_search", "search_analyzer": "standard" }, "twitter": { "type": "string" }, "web": { "type": "string" } } } } } }
Now documents can be added using the PersonCity class, which has no Elasticsearch-specific definitions.
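As a rough sketch, adding documents could look like this. It assumes the AddUpdateDocument and SaveChanges methods on the ElasticsearchCRUD context; the AddDocuments method name is only for illustration.

// Sketch: index a batch of PersonCity documents with ElasticsearchCRUD.
public void AddDocuments(IEnumerable<PersonCity> personCities)
{
    foreach (var personCity in personCities)
    {
        // The second argument is the document id used in Elasticsearch.
        _context.AddUpdateDocument(personCity, personCity.Id);
    }

    // Sends the pending documents to Elasticsearch.
    _context.SaveChanges();
}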
A terms aggregation search is used for the autocomplete request. The terms aggregation uses the autocomplete field which only exists in Elasticsearch. A list of strings is returned to the user from this request.
public IEnumerable<string> AutocompleteSearch(string term)
{
    var search = new Search
    {
        Size = 0,
        Aggs = new List<IAggs>
        {
            new TermsBucketAggregation("autocomplete", "autocomplete")
            {
                Order = new OrderAgg("_count", OrderEnum.desc),
                Include = new IncludeExpression(term + ".*")
            }
        }
    };

    var items = _context.Search<PersonCity>(search);
    var aggResult = items.PayloadResult.Aggregations.GetComplexValue<TermsBucketAggregationsResult>("autocomplete");
    IEnumerable<string> results = aggResult.Buckets.Select(t => t.Key.ToString());
    return results;
}
The request is sent to Elasticsearch as follows:
POST http://localhost:9200/personcitys/personcity/_search HTTP/1.1
Content-Type: application/json
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Content-Length: 124
Host: localhost:9200

{
    "size": 0,
    "aggs": {
        "autocomplete": {
            "terms": {
                "field": "autocomplete",
                "order": {
                    "_count": "desc"
                },
                "include": {
                    "pattern": "as.*"
                }
            }
        }
    }
}
When an autocomplete suggestion is selected, a search request is sent to Elasticsearch using a match query on the searchfield, which returns the first 10 hits starting from document 0. When a paging request is sent, the from value is a multiple of 10, depending on the requested page.
public PersonCitySearchResult Search(string term, int from)
{
    var personCitySearchResult = new PersonCitySearchResult();
    var search = new Search
    {
        Size = 10,
        From = from,
        Query = new Query(new MatchQuery("searchfield", term))
    };

    var results = _context.Search<PersonCity>(search);

    personCitySearchResult.PersonCities = results.PayloadResult.Hits.HitsResult.Select(t => t.Source);
    personCitySearchResult.Hits = results.PayloadResult.Hits.Total;
    personCitySearchResult.Took = results.PayloadResult.Took;

    return personCitySearchResult;
}
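For example, if the UI sends a 1-based page number instead of an offset, the from value could be derived as in the following sketch. The page size of 10 matches the Size used above; the SearchPage method name is only for illustration.

// Hypothetical helper: translate a 1-based page number into the
// "from" offset expected by the Search method above (page size 10).
public PersonCitySearchResult SearchPage(string term, int pageNumber)
{
    const int pageSize = 10;
    int from = (pageNumber - 1) * pageSize;
    return Search(term, from);
}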
The search query is sent as follows:
POST http://localhost:9200/personcitys/personcity/_search HTTP/1.1
Content-Type: application/json
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Content-Length: 74
Host: localhost:9200

{
    "from": 0,
    "size": 10,
    "query": {
        "match": {
            "searchfield": {
                "query": "asp.net"
            }
        }
    }
}
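To expose the two methods to the website, the search provider can be wired up in a small ASP.NET Core controller. The following is only a sketch, assuming the provider is registered for dependency injection behind a hypothetical IPersonCitySearchProvider interface; the route and action names are illustrative.

using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

namespace SearchComponent
{
    [Route("api/[controller]")]
    public class SearchController : Controller
    {
        private readonly IPersonCitySearchProvider _searchProvider;

        public SearchController(IPersonCitySearchProvider searchProvider)
        {
            _searchProvider = searchProvider;
        }

        // Returns autocomplete suggestions for the typed term.
        [HttpGet("autocomplete")]
        public IEnumerable<string> Autocomplete([FromQuery] string term)
        {
            return _searchProvider.AutocompleteSearch(term);
        }

        // Returns a page of search results for the selected term.
        [HttpGet]
        public PersonCitySearchResult Search([FromQuery] string term, [FromQuery] int from = 0)
        {
            return _searchProvider.Search(term, from);
        }
    }
}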
You can download the source code for this post on GitHub.
I hope this helps you understand how it can be done. Let me know if you face any difficulties.
You can read my previous blog post here.
Happy Coding {;}