From the course: Azure Spark Databricks Essential Training

Use a Python notebook with dashboards

- [Instructor] After you've created your cluster, you'll use the Databricks console, primarily the GUI and notebooks, to interact with it. If you have the premium edition, you can also use the Databricks CLI. You're going to want to use the Databricks console to interact with your Databricks Spark clusters, rather than the core Azure console, which is really only used in the setup stage. Of course, the whole point of using Azure Databricks is to process data, so you might be wondering what data sources are supported. One of the reasons for using Azure Databricks is the large number of data sources it can integrate with, and you can see them listed here. This list is constantly being enhanced and added to, so this is what's available at the time of this recording. Notably, you have things like Azure Blob Storage, Azure Data Lake, Cosmos DB, Azure SQL Data Warehouse, and open-source databases such as MongoDB and Neo4j, a lot of different types of…
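To make the notebook workflow concrete, here is a minimal sketch of a Python notebook cell that reads a file from Azure Blob Storage into a Spark DataFrame and renders it with display(), whose chart output can be pinned to a notebook dashboard. The storage account (mystorageaccount), container (mycontainer), file name (sales.csv), and secret scope (my-scope) are placeholder assumptions, not values from the course; spark, dbutils, and display() are provided automatically inside a Databricks notebook.

# Minimal sketch of a Databricks Python notebook cell. The storage account,
# container, file, and secret scope names are placeholders; substitute your own.
# spark, dbutils, and display() are built into the Databricks notebook runtime.

# Point Spark at the Blob Storage account (key pulled from a secret scope)
spark.conf.set(
    "fs.azure.account.key.mystorageaccount.blob.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="storage-key"))

# Read a CSV file from the container into a Spark DataFrame
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("wasbs://mycontainer@mystorageaccount.blob.core.windows.net/sales.csv"))

# display() renders an interactive results table that can be switched to a
# chart and added to a notebook dashboard
display(df)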
