SparkSQL --> XML Schema datatype conversion in OntopSpark


SparkSQL reference: https://spark.apache.org/docs/latest/sql-ref-datatypes.html

W3C recommended mappings: https://www.w3.org/2001/sw/rdb2rdf/wiki/Mapping_SQL_datatypes_to_XML_Schema_datatypes


Spark datatype | SparkSQL name | W3C recommendation | ONTOP default | OntopSpark
-------------- | ------------- | ------------------ | ------------- | ----------
BooleanType    | BOOLEAN       | xsd:boolean        | xsd:boolean   |
ByteType       | BYTE, TINYINT | xsd:integer or subtype | xsd:integer | xsd:byte
ShortType      | SHORT, SMALLINT | xsd:integer or subtype | xsd:integer | xsd:short
IntegerType    | INT, INTEGER  | xsd:integer or subtype | xsd:integer | xsd:int
LongType       | LONG, BIGINT  | xsd:integer or subtype | xsd:integer | xsd:long
FloatType      | FLOAT, REAL   | xsd:float or xsd:double | xsd:double | xsd:float
DoubleType     | DOUBLE        | xsd:double         | xsd:double    |
DateType       | DATE          | xsd:date           | xsd:date      |
TimestampType  | TIMESTAMP     | xsd:dateTime       | xsd:dateTime  |
StringType     | STRING        | xsd:string         | xsd:string    |
BinaryType     | BINARY        | xsd:hexBinary or xsd:base64Binary | xsd:hexBinary |
DecimalType    | DECIMAL, DEC, NUMERIC | xsd:decimal | xsd:decimal  |
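
For reference, a source table covering all of the types above can be created with the spark-sql CLI (for example from inside the Spark container). This is only an illustrative sketch: the spark-sql invocation, the table name and the column names are hypothetical and are not necessarily the ones used by the test data.

foo@bar:~$ spark-sql -e "
  CREATE TABLE IF NOT EXISTS datatypes_test (
    c_boolean   BOOLEAN,
    c_byte      TINYINT,
    c_short     SMALLINT,
    c_int       INT,
    c_long      BIGINT,
    c_float     FLOAT,
    c_double    DOUBLE,
    c_date      DATE,
    c_timestamp TIMESTAMP,
    c_string    STRING,
    c_binary    BINARY,
    c_decimal   DECIMAL(10,2)
  )"

Once such a table is exposed through the OBDA mapping, each column should come back in the SPARQL results as a literal typed with the corresponding xsd datatype from the OntopSpark column above.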

Running the test

  1. Build and run Apache Spark
foo@bar:~$ sudo ./docker-build.sh
foo@bar:~$ sudo docker-compose -f docker-compose-spark.yml up
  2. Wait for Apache Spark to complete its startup, then run OntopSpark in another console window
foo@bar:~$ sudo docker-compose -f docker-compose-ontop.yml up
  3. Connect to localhost:8080

  4. Execute the following query and check the datatypes of the result bindings (a curl sketch is shown after the query)

PREFIX : <http://www.semanticweb.org/spark-datatypes-test#>

SELECT * WHERE {
  ?sub ?pred ?obj .
}
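
The datatypes can also be checked from the command line by sending the same query with curl. This is a sketch that assumes the Ontop endpoint exposes its SPARQL service at the default /sparql path on port 8080; adjust the URL if your deployment differs.

foo@bar:~$ curl -G "http://localhost:8080/sparql" \
      --data-urlencode "query=SELECT * WHERE { ?sub ?pred ?obj } LIMIT 10" \
      -H "Accept: application/sparql-results+json"

In the returned JSON, every typed literal binding carries a "datatype" field, which can be compared against the OntopSpark column of the table above.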

About

Tests OntopSpark's OBDA datatype conversions for compliance with the W3C standards.
