bug fix #269
Conversation
Initial commit of the Fetch Surrounding Chunks Python notebook
added !pip install pandas
fixed issue during checks: installed google.colab
updated notebook to use an API key instead of username and password, similar to the notebook here: https://colab.research.google.com/github/elastic/elasticsearch-labs/blob/main/notebooks/search/00-quick-start.ipynb#scrollTo=f38e0397
Updated notebook to handle downloading required models such as ELSER and the sentence-transformers MiniLM model (a sketch of this step follows this commit list)
var chapter_number was not initialized. Fixed.
fixed bug: the chapter_number = None initialization was missing the = sign
added es_model_id
removed es_model_id as it is not needed
dense_embedding_model_id was missing from the query; renamed it
changed max_chapter_chunk_result for debugging
added error handling
added open in colab
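A minimal sketch of the model-download step mentioned above, assuming the esclient connection created in the notebook; the model IDs .elser_model_2 and sentence-transformers/msmarco-MiniLM-L-12-v3 are assumptions here, and the notebook's actual IDs and error handling may differ:

import time

# Download ELSER into the cluster (this call fails if the model was already created).
esclient.ml.put_trained_model(
    model_id=".elser_model_2", input={"field_names": ["text_field"]}
)

# Wait until the model definition is fully downloaded before starting a deployment.
while True:
    status = esclient.ml.get_trained_models(
        model_id=".elser_model_2", include="definition_status"
    )
    if status["trained_model_configs"][0]["fully_defined"]:
        break
    time.sleep(5)

esclient.ml.start_trained_model_deployment(
    model_id=".elser_model_2", wait_for="fully_allocated"
)

# The dense MiniLM model can be loaded with eland's CLI, for example:
# !eland_import_hub_model --cloud-id $ELASTIC_CLOUD_ID --es-api-key $ELASTIC_API_KEY \
#     --hub-model-id sentence-transformers/msmarco-MiniLM-L-12-v3 \
#     --task-type text_embedding --start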
Found 1 changed notebook. Review the changes at https://gitnotebooks.com/elastic/elasticsearch-labs/pull/269
Commented on notebook supporting-blog-content/fetch-surrounding-chunks/fetch-surrounding-chunks.ipynb
Cell 5 Line 32
from elasticsearch import Elasticsearch

# Create the client instance (ELASTIC_CLOUD_ID and ELASTIC_API_KEY are defined in an earlier cell)
esclient = Elasticsearch(
    cloud_id=ELASTIC_CLOUD_ID,
    api_key=ELASTIC_API_KEY,
)
print(esclient.info())
The committed output contains a UserWarning; I suggest not committing this.
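As a hedged aside (not necessarily how the fix was made here), one way to avoid committing such output is to clear it from the notebook before committing:

# Clears all cell outputs in place so warnings are not committed with the notebook.
!jupyter nbconvert --clear-output --inplace supporting-blog-content/fetch-surrounding-chunks/fetch-surrounding-chunks.ipynb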
@joemcelroy thank you. Issue resolved and pushed. review requested.
During @consulthys' review of the notebook, he found that duplicate chunks were being printed out; fixed.
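A minimal sketch of that duplicate-chunk fix, assuming a list of surrounding chunks returned by the search; the variable and field names below are assumptions, not the notebook's exact code:

# De-duplicate surrounding chunks by document _id before printing them.
seen_chunk_ids = set()
unique_chunks = []
for chunk in surrounding_chunks:
    if chunk["_id"] not in seen_chunk_ids:
        seen_chunk_ids.add(chunk["_id"])
        unique_chunks.append(chunk)

for chunk in unique_chunks:
    print(chunk["_source"]["text"])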