Databricks: Notebook SQL


This post shows how to work with Databricks SQL from a notebook.

Create a Temp View of a DataFrame.

  df = <SOMETHING>
  df.createOrReplaceTempView("<TABLE_NAME>")
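
Once registered, the temp view can be queried from a SQL cell in the same notebook. A minimal sketch, assuming the `<TABLE_NAME>` view created above:

```sql
%sql
-- Query the temp view registered from the DataFrame above
SELECT *
FROM <TABLE_NAME>
LIMIT 10;
```

The view is session-scoped: it disappears when the notebook's Spark session ends.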

Drop a Table

  %sql
  drop table <SCHEMA>.<TABLE>;

Describe Table

  %sql
  desc table extended <SCHEMA>.<TABLE>;

Describe Detail

  %sql
  describe detail <SCHEMA>.<TABLE>;

Show Table Properties

  %sql
  SHOW TBLPROPERTIES <SCHEMA>.<TABLE>;

Describe History

  %sql
  describe history <SCHEMA>.<TABLE>;
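
The versions listed by `DESCRIBE HISTORY` can be queried directly with Delta time travel. A sketch, assuming the table above is a Delta table and version 1 exists in its history:

```sql
%sql
-- Read the table as it existed at an earlier version
-- reported by DESCRIBE HISTORY (version 1 is an assumption)
SELECT *
FROM <SCHEMA>.<TABLE> VERSION AS OF 1;
```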

Create Schema

  %sql
  CREATE SCHEMA IF NOT EXISTS <SCHEMA>;

Create Parquet Table

  %sql
  CREATE TABLE <SCHEMA>.<TABLE> USING PARQUET LOCATION 'abfss://<CONTAINER>@<STORAGE_ACCOUNT>.dfs.core.windows.net/<FOLDER>/'

Create Delta Table

  %sql
  CREATE TABLE <SCHEMA>.<TABLE> USING DELTA LOCATION 'abfss://<CONTAINER>@<STORAGE_ACCOUNT>.dfs.core.windows.net/<FOLDER>/'

Upsert

  %sql
  MERGE INTO <SCHEMA>.<TABLE> AS t
  USING (
    SELECT column, column2
    FROM <SOURCE_TABLE>
  ) AS source
  ON source.column = t.column
  WHEN MATCHED THEN
    UPDATE SET t.column = source.column
  WHEN NOT MATCHED THEN
    INSERT (column, column2)
    VALUES (source.column, source.column2);
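
The same upsert can also be run from a Python cell via `spark.sql`, which is handy when the source query is built dynamically. A sketch, assuming the notebook's built-in `spark` SparkSession and the placeholder table/column names used above:

```python
# Sketch: run the MERGE from a Python cell using the notebook's
# built-in SparkSession (`spark`). <SCHEMA>, <TABLE>, <SOURCE_TABLE>
# and the column names are placeholders.
spark.sql("""
    MERGE INTO <SCHEMA>.<TABLE> AS t
    USING (
        SELECT column, column2
        FROM <SOURCE_TABLE>
    ) AS source
    ON source.column = t.column
    WHEN MATCHED THEN
        UPDATE SET t.column = source.column
    WHEN NOT MATCHED THEN
        INSERT (column, column2)
        VALUES (source.column, source.column2)
""")
```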