Commit e0205c2: Update README.md

1 parent ea45b49


README.md (13 additions & 3 deletions)
````diff
@@ -36,14 +36,24 @@ CI->>RE: Zip and Publish
 Go to this github [release page](https://github.com/clj-codes/docs.extractor/releases), download and unzip the `docs-db.zip` file.
 
 ## Connecting
-Since `v.0.2.0`, because of the new full-text index analyzers, this database requires this minimal connection opts to be used:
+Since `v.0.3.0`, because of the new full-text index analyzers, this database requires this minimal connection opts to be used:
 ```clojure
 (require '[datalevin.core :as d]
-         '[datalevin.search-utils :as su])
+         '[datalevin.search-utils :as su]
+         '[datalevin.interpret :refer [inter-fn]])
+
+(defn merge-tokenizers
+  "Merges the results of tokenizer a and b into one sequence."
+  [tokenizer-a tokenizer-b]
+  (inter-fn [^String s]
+    (into (sequence (tokenizer-a s))
+          (sequence (tokenizer-b s)))))
 
 (def conn-opts
   (let [query-analyzer (su/create-analyzer
-                          {:tokenizer (su/create-regexp-tokenizer #"[\s:/\.;,!=?\"'()\[\]{}|<>&@#^*\\~`\-]+")
+                          {:tokenizer (merge-tokenizers
+                                        (inter-fn [s] [[s 0 0]])
+                                        (su/create-regexp-tokenizer #"[\s:/\.;,!=?\"'()\[\]{}|<>&@#^*\\~`\-]+"))
                           :token-filters [su/lower-case-token-filter]})]
     {:search-domains {"project-name" {:query-analyzer query-analyzer}
                       "namespace-name" {:query-analyzer query-analyzer}
````
