Get Perseus definitions #26
@kylepjohnson I would like to implement the definition endpoint. How do I get started on this one?
I'm assuming the JSON format to be:
Great--looks good. Here is a proposed data format I have been working with for the definitions so far: https://github.com/cltk/cltk_frontend/blob/master/client/views/reading/DefinitionsPanel.jsx#L18
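For illustration only, a definitions payload along these lines might look like the sketch below. The field names here are assumptions for discussion, not the actual format used in DefinitionsPanel.jsx:

```python
# Hypothetical shape for a single definition lookup result.
# Field names are illustrative assumptions, not the frontend's actual format.
definition_response = {
    "word": "Arma",
    "lemma": "arma",
    "language": "latin",
    "definitions": [
        "arms, weapons",
        "tools, implements",
    ],
    "source": "Perseus",
}

print(definition_response["lemma"])
```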
Thanks for the link. I'll post a more detailed overview, considering your implementation on the front end.
@lukehollis Can you please give me the link to the definitions in the corpus? Thanks!
We will need to look these up programmatically based on an input string. I'm indifferent on whether that input string should be a single word or a sentence/line of poetry.
Here's an example query for the first word of the Aeneid: http://www.perseus.tufts.edu/hopper/morph?l=Arma&la=la
@lukehollis @kylepjohnson If I'm not mistaken, whenever a user queries for a definition, we want to extract data from the following source: http://www.perseus.tufts.edu/hopper/morph?l=Arma&la=la
Give me a heads-up and I'll implement this feature right away.
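A minimal sketch of what the lookup could look like: build the query URL for the Perseus morph endpoint and parse an xmlmorph-style response. The XML fixture here is a simplified assumption of the response shape, not the actual Perseus output (which differs and, per the discussion below, may lack definitions):

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

PERSEUS_MORPH_URL = "http://www.perseus.tufts.edu/hopper/morph"

def build_morph_query(word, lang="la"):
    """Build the Perseus morphology query URL for a single word."""
    return f"{PERSEUS_MORPH_URL}?{urlencode({'l': word, 'la': lang})}"

# Simplified, hypothetical xmlmorph-style response; the real format differs.
SAMPLE_XML = """<analyses>
  <analysis>
    <form>Arma</form>
    <lemma>arma</lemma>
    <pos>noun</pos>
    <number>pl</number>
    <gender>neut</gender>
    <case>nom</case>
  </analysis>
</analyses>"""

def parse_analyses(xml_text):
    """Extract one dict of tag -> text per <analysis> element."""
    root = ET.fromstring(xml_text)
    return [
        {child.tag: child.text for child in analysis}
        for analysis in root.findall("analysis")
    ]

print(build_morph_query("Arma"))
print(parse_analyses(SAMPLE_XML))
```

In a real endpoint the URL would be fetched over HTTP and the response body fed to the parser; that part is omitted here since the response schema is an assumption.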
That's a good question, @manu-chroma--and thanks for tracking down the xmlmorph version. I can see both sides and don't have a very strong opinion one way or the other, but since this issue specifically deals with hitting an external web API, I think it should live in cltk_api instead of core. @kylepjohnson will have a better sense of the direction for the core package, though.
@lukehollis I get your point. By the way, the xmlmorph version looks incomplete to me: it doesn't contain the definition. I think scraping is the way to go. What do you think?
Hey guys, an alternative to using the Perseus API would be to use the same lexicon they're using. I somehow accidentally deleted the relevant files for Latin (they need to be re-added to cltk/latin_lexica_perseus); here's Greek:
The Latin works the same. @lukehollis, does the Perseus API offer more than what's in these files? Note: I'm going to open a new ticket so that the Latin lemmata files get re-added.
Referencing Issue #30. @manu-chroma, would you like to do this? It's related, and I bet it will answer your question about whether to use the API or parse the files ourselves.
Not that I know of, and if we can use our own servers, it seems it'd be nicer to their systems not to be hitting their API all the time (not sure of the usage restraints there). Working from the cltk lexica repos seems like a better solution if possible. All for it. 👍
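If the lookup works from the local lexica instead of the Perseus API, it could be as simple as a headword lookup in a file loaded at startup. This is a sketch assuming the lexica are available as a flat JSON mapping of headword to definition; the actual format in the cltk lexica repos may differ:

```python
import json

def load_lexicon(path):
    """Load a headword -> definition mapping from a JSON lexicon file.

    Assumes a flat {"headword": "definition"} layout; the real cltk
    lexica files may be structured differently.
    """
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def define(lexicon, word):
    """Case-insensitive headword lookup; returns None if absent."""
    return lexicon.get(word.lower())

# Tiny inline fixture standing in for a real lexicon file.
sample_lexicon = {"arma": "arms, weapons; tools, implements"}
print(define(sample_lexicon, "Arma"))
```

Loading once and keeping the dict in memory avoids repeated requests to Perseus entirely, which is the point raised above about being kind to their servers.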
Hey @manu-chroma, any updates here? Would it be possible to push any progress that you've made?
Hey @lukehollis, I've emailed you about this--please check.
Thanks so much, @manu-chroma--and good luck with the switch!
Cross-referencing Issue cltk/cltk_frontend#60 by @lukehollis.