Error is:
Uncaught TypeError: nodejieba.cut is not a function
at lunr.zh.tokenizer (lunr.zh.js:98:1)
at lunr.Builder.add (lunr.js:2479:1)
at lunr.Builder. (xxx)
at lunr (lunr.js:53:1)
at XMLHttpRequest. (xxx)
I'm afraid I'm not yet skilled enough to dive in and resolve this myself, but if someone could review it, or let me know exactly what I would need to do to handle it, I would much appreciate it.
The text was updated successfully, but these errors were encountered:
Are you trying to run this in a browser environment? The zh tokenizer requires Node.js to run, because it uses a C++ addon (nodejieba). I opened an issue (#90) where I describe how you can use the built-in Intl.Segmenter instead to segment Chinese (and other languages) quite easily. Here is a fork where I switched the zh module to using Intl.Segmenter.
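For illustration, here is a minimal sketch of that approach, assuming a Node.js 16+ or modern-browser environment where Intl.Segmenter is available; the function name segmentChinese is my own, not part of lunr or the fork:

```javascript
// Tokenize Chinese text with the built-in Intl.Segmenter, as an
// alternative to the nodejieba C++ addon used by lunr.zh.tokenizer.
const segmenter = new Intl.Segmenter('zh', { granularity: 'word' });

function segmentChinese(text) {
  // segment() yields { segment, isWordLike, ... } objects; keep only
  // word-like segments, dropping punctuation and whitespace.
  return Array.from(segmenter.segment(text))
    .filter((s) => s.isWordLike)
    .map((s) => s.segment);
}

console.log(segmentChinese('我喜欢吃苹果'));
```

Because Intl.Segmenter ships with the JavaScript runtime, this works in browsers without bundling any native addon, which is exactly what breaks nodejieba there.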