SPARQL API inserted DATA cannot be modified with web interface #3900
Hi, are you inserting via the SPARQL API into the named graph called http://vitro.mannlib.cornell.edu/default/vitro-kb-2? If not, you should try using that graph if you want the ingested data to be editable; that's the only graph the UI editing interface can add to or remove from. It can indeed be confusing if you add triples to other graphs, because they will still be visible in the GUI editor, but the editor won't be able to delete the old triples involved in an edit. In the future it would be a nice improvement to make the editor either work across all named graphs or at least only offer edit controls on the data in the kb-2 graph.
Yes, the wiki page should be updated to include this caveat. What you'll need is to use the exact same graph for all the inserts, i.e. INSERT DATA { GRAPH <http://vitro.mannlib.cornell.edu/default/vitro-kb-2> { ... } }, not just the same graph namespace. If you need to keep things in separate graphs but still let the GUI editor delete from all of them, you may be able to make some relatively small modifications to the code to support what you need. On one recent project I changed the editor to delete from any graph, not just kb-2. Any new triples the GUI editor added during an edit still got added to the single kb-2 graph (which in that case is what we wanted; the graph reflects the provenance of the triples). A more complicated change would be to have the editor put any new triples matching, for example, the pattern ?g { ?s ?p ?y } back into the same graph as the removed ?g { ?s ?p ?x }, but it could be done.
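For concreteness, wrapping every insert in the single kb-2 graph as suggested above can be sketched in a few lines. This is an illustrative helper, not VIVO code; the example triple and function name are made up:

```python
# The one named graph the VIVO UI editor can add to and remove from.
KB2_GRAPH = "http://vitro.mannlib.cornell.edu/default/vitro-kb-2"

def wrap_in_kb2(triples):
    """Return an INSERT DATA update targeting the kb-2 named graph.

    `triples` is a list of strings, each a complete "<s> <p> o" term
    without the trailing dot.
    """
    body = "\n".join(f"    {t} ." for t in triples)
    return (
        "INSERT DATA {\n"
        f"  GRAPH <{KB2_GRAPH}> {{\n"
        f"{body}\n"
        "  }\n"
        "}"
    )

# Illustrative triple; real batches would come from the MySQL export.
query = wrap_in_kb2(
    ['<urn:ex:alice> <http://www.w3.org/2000/01/rdf-schema#label> "Alice"']
)
print(query)
```

Keeping the graph URI in one constant makes it hard for different batch generators to drift apart on "the same graph namespace but a different graph", which is the failure mode described above.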
Yes, I'll just add the triples in one graph, no problem for me; I'll use the existing way to generate the URIs for the triples. I just won't need to generate individual graphs for everything. Just a little code change on the database side, no issues.
That looks good. I understand the frustration. I created an issue for updating the wiki; hopefully we can make it clearer going forward. #3901
Thank you so much. I will post here if it works. I just need to redesign our university alliance MySQL DB that we will use to translate data between our systems and VIVO in near real-time :) Again, thank you so much; now I can complete my job to the level where it will be really well done, without issues :)
Hi, reporting back. Thank you so much :)
Very glad to hear it. You are very welcome, and thanks for letting us know the outcome. |
I have successfully added lots of data into VIVO through the SPARQL API. But when I add data with a curl POST to the API, the data shows up, and when you try to delete it from the interface it doesn't work.
The same goes for editing: it just adds another label alongside the edited data; the older data is intact.
What I did:
I can successfully insert/update/delete data through the SPARQL API when the data is changed in the internal MySQL database that generates the SPARQL query batches for execution on the API endpoint.
1. Imported data through the SPARQL API - success.
1.1. Tried to delete/edit data through the interface - FAIL. The data starts to be doubled/tripled if I edit an item that I inserted with SPARQL.
1.2. Exported the A-box of the application and tried to erase the data via ADD/REMOVE RDF - doesn't work.
1.3. Successfully removed the data through a SPARQL batch.
2. Imported the data that I exported in 1.2.
2.1. I can edit/delete the data through the interface.
2.2. Tried to delete the data with the same batch that uploaded it initially - DOESN'T WORK.
2.3. Removed the data through ADD/REMOVE RDF - WORKS.
For me this is caused by different encodings when parsing data through the API.
My SPARQL data:

```
dfeal 1688362755-insert_batch-1-sparql.querybatch
{
  "encoding": "UTF-8",
  "language": "english",
  "confidence": {
    "encoding": 1,
    "language": 0.01
  }
}
file -i 1688362755-insert_batch-1-sparql.querybatch
1688362755-insert_batch-1-sparql.querybatch: text/plain; charset=utf-8
```
When I export the data that I imported to, for example, TTL:

```
dfeal localtes_erua-eui.eu-rdf_export-2023-07-03.ttl
{
  "encoding": "UTF-8",
  "language": "norwegian",
  "confidence": {
    "encoding": 1,
    "language": 0.12
  }
}
file -i localtes_erua-eui.eu-rdf_export-2023-07-03.ttl
localtes_erua-eui.eu-rdf_export-2023-07-03.ttl: text/plain; charset=utf-8
```

I'm not in Norway!!!
My curl request:

```shell
output=$(curl $sparqlapi -o /dev/null -w %{http_code} -s -H "Content-Type=text/plain; charset=utf-8;" -d "email="$vivouser"" -d "password="$vivopass"" -d "@"$path$file"")
```
For me this is caused by Java loading the data from the database and then saving it with a different encoding...
The SPARQL API itself works with UTF-8 (the documentation on the wiki page says so), but I think the curl POST body is not processed as UTF-8: by default Java decodes the POST body as Latin-1 (ISO-8859-1), which covers Norwegian characters, and the result then gets converted to UTF-8.
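That hypothesis is easy to reproduce in isolation: decoding UTF-8 bytes as Latin-1 mangles Norwegian letters like å and æ into two-character sequences, which would explain why a language detector flags the export as Norwegian. A minimal sketch, independent of VIVO or curl:

```python
# Simulate a server that misreads UTF-8 request bytes as Latin-1.
original = "blåbær"                     # sample data with Norwegian letters
wire_bytes = original.encode("utf-8")   # the bytes curl actually sends

# A servlet defaulting to ISO-8859-1 would decode the body like this:
misread = wire_bytes.decode("latin-1")

print(misread)  # → blÃ¥bÃ¦r

# Re-encoding the misread string yields different bytes than the original,
# so triples stored this way never match the UI editor's delete patterns.
assert misread.encode("utf-8") != wire_bytes
```

The same mechanism would also explain the "doubled" labels: the UI writes a correctly encoded copy while the mis-decoded original stays behind.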
Any idea how to fix this? Thanks in advance.
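One thing worth checking first: in the curl call above the header is written `-H "Content-Type=text/plain; ..."` with `=` instead of `:`. curl expects `Name: value`, so that string is likely not sent as a header at all, and the request goes out with curl's default form encoding. Below is a sketch of the same POST with an explicit, well-formed UTF-8 Content-Type, using only Python's standard library; the endpoint URL, credentials, and the `update` parameter name are assumptions to verify against your VIVO instance and the wiki:

```python
import urllib.parse
import urllib.request

# Hypothetical endpoint; substitute your own VIVO instance.
SPARQL_API = "http://localhost:8080/vivo/api/sparqlUpdate"

def build_update_request(email, password, update_text):
    """Build a POST request whose body is explicitly UTF-8 encoded."""
    body = urllib.parse.urlencode({
        "email": email,
        "password": password,
        "update": update_text,   # parameter name assumed; check the wiki
    }).encode("utf-8")           # force UTF-8 on the wire
    return urllib.request.Request(
        SPARQL_API,
        data=body,
        # Note the colon: "Content-Type: ..." is a header;
        # "Content-Type=..." is silently not.
        headers={"Content-Type":
                 "application/x-www-form-urlencoded; charset=UTF-8"},
        method="POST",
    )

req = build_update_request(
    "admin@example.com", "secret",
    'INSERT DATA { <urn:ex:s> <urn:ex:p> "blåbær" }')
# urllib.request.urlopen(req) would send it; omitted here.
```

Percent-encoding the body yields pure ASCII on the wire (å becomes %C3%A5), which sidesteps any server-side default charset for the raw body.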