This post shows how to connect to Oracle from JDBC.
There are four main connection examples:
TCP to Single Host
jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=<HOST>)(PORT=<PORT>))(CONNECT_DATA=(SERVICE_NAME=<SERVICE_NAME>)))
TCP to Multiple Hosts
jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=<HOST1>)(PORT=<PORT1>))(ADDRESS=(PROTOCOL=TCP)(HOST=<HOST2>)(PORT=<PORT2>))(CONNECT_DATA=(SERVICE_NAME=<SERVICE_NAME>)))
SSL to Single Host
jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCPS)(HOST=<HOST>)(PORT=<PORT>))(CONNECT_DATA=(SERVICE_NAME=<SERVICE_NAME>)))
SSL to Multiple Hosts
jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCPS)(HOST=<HOST1>)(PORT=<PORT1>))(ADDRESS=(PROTOCOL=TCPS)(HOST=<HOST2>)(PORT=<PORT2>))(CONNECT_DATA=(SERVICE_NAME=<SERVICE_NAME>)))
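Filling in the placeholders by hand is error-prone, so it can help to assemble the descriptor programmatically. The sketch below is my own helper (the function name and defaults are assumptions, not part of any Oracle API); it covers all four cases above by taking a list of hosts and an SSL flag:

```python
def oracle_jdbc_url(hosts, service_name, ssl=False):
    """Build an Oracle thin-driver JDBC URL.

    `hosts` is a list of (host, port) pairs; passing more than one
    produces the multiple-host form. `ssl=True` switches the protocol
    from TCP to TCPS.
    """
    protocol = "TCPS" if ssl else "TCP"
    addresses = "".join(
        f"(ADDRESS=(PROTOCOL={protocol})(HOST={host})(PORT={port}))"
        for host, port in hosts
    )
    return (
        "jdbc:oracle:thin:@(DESCRIPTION="
        + addresses
        + f"(CONNECT_DATA=(SERVICE_NAME={service_name})))"
    )

# Single host over TCP (hostname and service name are examples):
url = oracle_jdbc_url([("db1.example.com", 1521)], "ORCL")
```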
Connect Using PySpark With Databricks
You will need to get the secret from a key vault. Refer to Databricks: Get Secret
secret = <Get_Secret>

df = spark.read \
    .format("jdbc") \
    .option("url", "<URL>") \
    .option("dbtable", "<SCHEMA>.<TABLE>") \
    .option("user", "<ACCOUNT>") \
    .option("password", secret) \
    .option("driver", "oracle.jdbc.driver.OracleDriver") \
    .load()
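If you connect to several tables, the repeated `.option(...)` calls can be collected in a plain dictionary and unpacked with Spark's `options(**...)`. The helper below is my own (its name and signature are assumptions, not a Spark API):

```python
def jdbc_options(url, table, user, password):
    """Return the option map for a Spark JDBC read against Oracle."""
    return {
        "url": url,
        "dbtable": table,
        "user": user,
        "password": password,
        "driver": "oracle.jdbc.driver.OracleDriver",
    }

# Usage inside a Databricks notebook (spark and secret come from the session):
# df = spark.read.format("jdbc") \
#     .options(**jdbc_options("<URL>", "<SCHEMA>.<TABLE>", "<ACCOUNT>", secret)) \
#     .load()
```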
Connect Using SQL With Databricks
It should be noted that this is not an ideal approach, and I don't believe it is completely secure.
Add cluster configuration
spark.databricks.secureVariableSubstitute.enabled false
spark.<SOME_VARIABLE_NAME> {{secrets/<SCOPE>/<SECRET>}}
Code
%sql
CREATE SCHEMA IF NOT EXISTS <SCHEMA_NAME>;

DROP VIEW IF EXISTS <VIEW_NAME>;

-- Temporary views are session-scoped and cannot be schema-qualified
CREATE TEMPORARY VIEW <VIEW_NAME>
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "<URL>",
  dbtable "<SCHEMA_NAME>.<TABLE_NAME>",
  user "<ACCOUNT>",
  password "${spark.<SOME_VARIABLE_NAME>}",
  driver "oracle.jdbc.driver.OracleDriver"
)