How to use StanfordCoreNLPServer to load scenegraph library #1346
It's the same problem as on the stanza side. The code to run scenegraph in a server simply doesn't exist. I'm sure you've found this already, but there are instructions on how to run it here: https://nlp.stanford.edu/software/scenegraph-parser.shtml
That's the only thing that exists currently. We could theoretically include it in a server, but until you came along, there was never any request for that. What are you looking for that the CLI and the API don't provide?
I wonder if you could include the following two APIs in the NLP server, so that I can use the server to turn a sentence into a scene graph in Python, like this:

```python
sentence = "A brown fox chases a white rabbit."
parser = CoreNLPParser(url='http://localhost:9000')
print(sg.toReadableString())
```

It will be helpful for us, and we will cite your work at https://nlp.stanford.edu/software/scenegraph-parser.shtml
Sorry for the misunderstanding, I just want to parse a sentence into a scene graph with Python @AngledLuffa
can do, but it will be at least a week until we make that available
thanks for your help, looking forward to your reply @AngledLuffa
Sorry for the long delay. Spring quarter was quite busy (we did submit a couple drafts out of work we had done, so that was good). I have time to pick this up again, and after looking over scenegraph, I think I can put something together.
Which of these would be more useful, assuming you even still need this upgrade?
Processes requests using the scenegraph package: https://nlp.stanford.edu/software/scenegraph-parser.shtml
Output is in either the text or json format from the scenegraph package. Requested in #1346
… interface which will be released in the next version of CoreNLP. stanfordnlp/CoreNLP#1346
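For reference, a rough sketch of what a request against that new server handler could look like; the `/scenegraph` endpoint path and the idea of posting the raw sentence as the request body are assumptions based on the commit description above, not a documented API:

```python
import requests

# Hypothetical endpoint path; the commit above only says the server now
# processes requests using the scenegraph package.
url = "http://localhost:9000/scenegraph"

sentence = "A brown fox chases a white rabbit."

# Post the raw sentence and print whatever comes back (per the commit
# description, the output is in the scenegraph text or json format).
response = requests.post(url, data=sentence.encode("utf-8"))
print(response.text)
```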
Thanks for your reply. I wonder if you can include the scene graph parser in stanza, so that I can use it directly after installing stanza?
One way or another, it'll require having the Java implementation of SceneGraph, as there really isn't any impetus to update the project for python & deep learning in the group
can you implement it like this, maybe the second way?
You can see what I did in the two changes listed above. If you have a CoreNLP server created, or if you create one using the Stanza API, you can then call the new scenegraph method. It occurs to me that I don't actually know if the SceneGraph parser is threadsafe. 🤷
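A minimal sketch of what that could look like through the Stanza client, assuming the call added in the changes above is exposed as a `scenegraph` method on the client (the name is inferred from later comments in this thread):

```python
from stanza.server import CoreNLPClient

text = "A brown fox chases a white rabbit."

# Start a CoreNLP server through the Stanza API, then send it a scenegraph request.
with CoreNLPClient(timeout=60000, memory='16G') as client:
    result = client.scenegraph(text)  # assumed method name
    print(result)
```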
It might be a little while before the next official release, but I can make some sort of temporary package available if you'd use it in the short term
Thanks, I will give it a try in the coming week
Processes requests using the scenegraph package: https://nlp.stanford.edu/software/scenegraph-parser.shtml
Output is in either the text or json format from the scenegraph package. Requested in #1346
Leave a note about not having tested the scenegraph parser for thread safety
I haven't made an official CoreNLP release with the updates yet, but you can use this for the server changes: https://nlp.stanford.edu/software/stanford-corenlp-4.5.4.zip
For the Stanza interface, you'll have to install the dev branch, as described here: https://stackoverflow.com/questions/20101834/pip-install-from-git-repo-branch
Sorry, I still have trouble building the client with the Stanza API. I set up the client with `with CoreNLPClient(annotators=['tokenize','ssplit','pos','lemma','ner','parse','depparse','coref'], timeout=60000, memory='16G') as client:`. Did I miss something?
```python
from stanza.server import CoreNLPClient

print('---')
text = "Chris Manning is a nice person. Chris wrote a simple sentence. He also gives oranges to people."
print(text)
print('---')
with CoreNLPClient(timeout=60000, memory='16G') as client:
```
bug here: input text
@AngledLuffa can you provide your demo code for using `client.scenegraph`? Thanks
what failure are you getting? this works for me with the latest CoreNLP (4.5.5) and the latest stanza (1.5.1)
getting a connection error makes me wonder if there's a permissions problem binding that port |
but the demo for CoreNLP works for me

```python
from stanza.server import CoreNLPClient

print('---')
text = "Chris Manning is a nice person. Chris wrote a simple sentence. He also gives oranges to people."
print(text)
print('---')
with CoreNLPClient(annotators=['tokenize','ssplit','pos','lemma','ner','parse','depparse','coref'], timeout=60000, memory='16G') as client:
```
I have found that it behaves a bit differently depending on whether the server is already started or is started inside a script. Let me track that down
this works for me after debugging, but I do not know why @AngledLuffa

```python
from stanza.server import CoreNLPClient

text = "Chris Manning is a nice person. Chris wrote a simple sentence. He also gives oranges to people."
with CoreNLPClient(annotators=['tokenize'], timeout=60000, memory='16G') as client:
```
seems there are some procedures to initialize the client in client.annotate()
ok, figured it out, there needs to be a call that waits for the server to finish starting before the first request goes out. It was supposed to be in the client itself, so that's on me. I'll fix it on the dev branch. For now you can just add this line, so the code to start & then run it looks like this:
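A minimal sketch of that start-and-run sequence, assuming the extra line is the client's `ensure_alive()` call; the exact line isn't quoted here, and the guess is based on the later commit note about a request failing while the server is still loading:

```python
from stanza.server import CoreNLPClient

text = "Chris Manning is a nice person. Chris wrote a simple sentence. He also gives oranges to people."

with CoreNLPClient(annotators=['tokenize'], timeout=60000, memory='16G') as client:
    # Assumed fix: block until the freshly launched server is actually up,
    # so the first request is not sent while the server is still loading.
    client.ensure_alive()

    # Assumed method name for the scenegraph request added in the changes above.
    result = client.scenegraph(text)
    print(result)
```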
@AngledLuffa great! works for me now, thanks for your help!
👍 I will add that to the client so you won't need to do it in future versions
@AngledLuffa got it!
… Otherwise, a request attempt might happen while the server is still loading and therefore fail. See discussion at stanfordnlp/CoreNLP#1346
I want to construct a StanfordCoreNLPServer that has scenegraph preloaded. How can I achieve this?
I have tried the command

```
java -mx4g -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer \
    -preload scenegraph \
    -status_port 9000 -port 9000 -timeout 15000 &
```

but it seems to fail.