
WebApi - Spark Databricks configuration driver

Hello, we are performing our first installation of WebAPI and Atlas against our CDM located on Spark Databricks. Our source configuration is jdbc:spark://xxxxxxx.azuredatabricks.net:443/default;uid=token;PWD=xxxxxxxx;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/8258457304257519/0122-121530-8os6ljem;AuthMech=3;useNativeQuery=1;
and the driver copied into WEB-INF/lib is SparkJDBC42.jar. Everything seems to work correctly: the application connects to the database and displays results. However, the application log constantly shows the following error:

java.sql.SQLException: No suitable driver found for  jdbc:spark://aXXXXXXXXXX.azuredatabricks.net:443/default;uid=token;PWD=XXXXXXXXXXXXXXXXX;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/8258457304257519/0122-121530-8os6ljem;AuthMech=3;useNativeQuery=1;
	at java.sql.DriverManager.getConnection(Unknown Source)
	at java.sql.DriverManager.getConnection(Unknown Source)
	at org.ohdsi.sql.BigQuerySparkTranslate.sparkHandleInsert(BigQuerySparkTranslate.java:508)
	at org.ohdsi.webapi.util.PreparedStatementRenderer.getSql(PreparedStatementRenderer.java:293)
	at org.ohdsi.webapi.service.VocabularyService.getInfo(VocabularyService.java:1202)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

We have also tried the DatabricksJDBC42.jar driver, but in that case nothing works at all. Is the configuration we are using correct? How can we avoid this error?
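
From the stack trace, the error appears to come from a plain DriverManager.getConnection call inside BigQuerySparkTranslate.sparkHandleInsert, so perhaps the Spark driver is not registered with DriverManager in that code path even though the configured data source itself works. As a rough diagnostic sketch (assuming the Simba driver class name com.simba.spark.jdbc.Driver for SparkJDBC42.jar and com.databricks.client.jdbc.Driver for DatabricksJDBC42.jar; we have not confirmed either against the jar contents), something like this could list which drivers are actually visible:

	import java.sql.Driver;
	import java.sql.DriverManager;
	import java.util.Enumeration;

	public class DriverCheck {
	    public static void main(String[] args) throws Exception {
	        // Assumed class name for the Simba Spark driver shipped in SparkJDBC42.jar;
	        // DatabricksJDBC42.jar would register com.databricks.client.jdbc.Driver instead.
	        Class.forName("com.simba.spark.jdbc.Driver");

	        // Print every driver currently registered with DriverManager, to see whether
	        // a driver accepting jdbc:spark: URLs is visible to DriverManager.getConnection.
	        Enumeration<Driver> drivers = DriverManager.getDrivers();
	        while (drivers.hasMoreElements()) {
	            System.out.println(drivers.nextElement().getClass().getName());
	        }
	    }
	}

If the Spark driver does not show up in that list, that would explain the "No suitable driver found" message.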

Best regards and thank you.
