This post shows how to work with Databricks SQL from a notebook.
Create a Temp View of a DataFrame.
df = <SOMETHING>  # any DataFrame
df.createOrReplaceTempView("<TABLE_NAME>")
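Once registered, the view can be queried from a SQL cell in the same notebook. A minimal sketch, assuming a hypothetical DataFrame of sales data registered as `sales_view`:

```sql
-- query the temp view registered above; view and column names are hypothetical
SELECT region, SUM(amount) AS total_sales
FROM sales_view
GROUP BY region;
```

Note that the view is session-scoped: it disappears when the notebook detaches from the cluster.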
Drop a Table
%sql DROP TABLE <SCHEMA>.<TABLE>;
Describe Table
%sql DESCRIBE TABLE EXTENDED <SCHEMA>.<TABLE>;
Describe Detail
%sql DESCRIBE DETAIL <SCHEMA>.<TABLE>;
Show Table Properties
%sql SHOW TBLPROPERTIES <SCHEMA>.<TABLE>;
Describe History
%sql DESCRIBE HISTORY <SCHEMA>.<TABLE>;
Create Schema
%sql CREATE SCHEMA IF NOT EXISTS <SCHEMA>;
Create Parquet Table
%sql CREATE TABLE <SCHEMA>.<TABLE> USING PARQUET LOCATION 'abfss://<CONTAINER>@<STORAGE_ACCOUNT>.dfs.core.windows.net/<FOLDER>/';
Create Delta Table
%sql CREATE TABLE <SCHEMA>.<TABLE> USING DELTA LOCATION 'abfss://<CONTAINER>@<STORAGE_ACCOUNT>.dfs.core.windows.net/<FOLDER>/';
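A related sketch: omitting LOCATION creates a managed Delta table, and columns can be declared inline. The schema, table, and column names here are hypothetical:

```sql
-- managed Delta table; data lives under the schema's default location
CREATE TABLE IF NOT EXISTS sales.orders (
  order_id BIGINT,
  region   STRING,
  amount   DOUBLE
) USING DELTA;
```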
Upsert
MERGE INTO schema.table AS t
USING (
  SELECT columns
  FROM table
) AS source
  ON source.column = t.column
WHEN MATCHED THEN
  UPDATE SET t.column = source.column
WHEN NOT MATCHED THEN
  INSERT (column, column2)
  VALUES (source.column, source.column2);
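A concrete sketch of the upsert pattern, assuming hypothetical tables `sales.orders` (target) and `staging.orders` (source) keyed on `order_id`:

```sql
-- update existing orders, insert new ones; table and column names are hypothetical
MERGE INTO sales.orders AS t
USING staging.orders AS source
  ON source.order_id = t.order_id
WHEN MATCHED THEN
  UPDATE SET t.amount = source.amount
WHEN NOT MATCHED THEN
  INSERT (order_id, amount)
  VALUES (source.order_id, source.amount);
```

Delta Lake requires the WHEN MATCHED clause(s) to appear before WHEN NOT MATCHED, so keep them in that order.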