
Azure Databricks Cookbook

Phani Raj, Vinod Jaiswal

About this Book

Get to grips with building and productionizing end-to-end big data solutions in Azure and learn best practices for working with large datasets

Key Features

  • Integrate with Azure Synapse Analytics, Cosmos DB, and Azure HDInsight Kafka Cluster to scale and analyze your projects and build pipelines
  • Use Databricks SQL to run ad hoc queries on your data lake and create dashboards
  • Productionize a solution using CI/CD for deploying notebooks and Azure Databricks Service to various environments

Book Description

Azure Databricks is a unified collaborative platform for performing scalable analytics in an interactive environment. The Azure Databricks Cookbook provides recipes to get hands-on with the analytics process, including ingesting data from various batch and streaming sources and building a modern data warehouse.

The book starts by teaching you how to create an Azure Databricks instance using the Azure portal, the Azure CLI, and ARM templates. You'll work through clusters in Databricks and explore recipes for ingesting data from sources including files, databases, and streaming sources such as Apache Kafka and Event Hubs. The book will help you explore all the features supported by Azure Databricks for building powerful end-to-end data pipelines. You'll also find out how to build a modern data warehouse by using Delta tables and Azure Synapse Analytics. Later, you'll learn how to write ad hoc queries and extract meaningful insights from the data lake by creating visualizations and dashboards with Databricks SQL. Finally, you'll deploy and productionize a data pipeline as well as deploy notebooks and the Azure Databricks service using continuous integration and continuous delivery (CI/CD).

By the end of this Azure book, you'll be able to use Azure Databricks to streamline the different processes involved in building data-driven apps.

What you will learn

  • Read and write data from and to various Azure resources and file formats
  • Build a modern data warehouse with Delta Tables and Azure Synapse Analytics
  • Explore jobs, stages, and tasks and see how Spark lazy evaluation works
  • Handle concurrent transactions and learn performance optimization in Delta tables
  • Learn Databricks SQL and use it to create real-time dashboards
  • Integrate Azure DevOps for version control, deploying, and productionizing solutions with CI/CD pipelines
  • Discover how to use RBAC and ACLs to restrict data access
  • Build an end-to-end data processing pipeline for near real-time data analytics

Who this book is for

This recipe-based book is for data scientists, data engineers, big data professionals, and machine learning engineers who want to perform data analytics on their applications. Prior experience of working with Apache Spark and Azure is necessary to get the most out of this book.

Chapter 1: Creating an Azure Databricks Service

Azure Databricks is a high-performance Apache Spark-based platform that has been optimized for the Microsoft Azure cloud.
It offers three environments for building and developing data applications:
  • Databricks Data Science and Engineering: This provides an interactive workspace that enables collaboration between data engineers, data scientists, machine learning engineers, and business analysts and allows you to build big data pipelines.
  • Databricks SQL: This allows you to run ad hoc SQL queries on your data lake and supports multiple visualization types to explore your query results.
  • Databricks Machine Learning: This provides an end-to-end machine learning environment for feature development, model training, experiment tracking, model serving, and model management.
In this chapter, we will cover how to create an Azure Databricks service using the Azure portal, Azure CLI, and ARM templates. We will learn about different types of clusters available in Azure Databricks, how to create jobs, and how to use a personal access token (PAT) to authenticate Databricks.
By the end of this chapter, you will have learned how to create a Databricks service using the Azure portal, Azure CLI, and ARM templates and how to create different types of clusters. You will also start working with Notebooks and scheduling them as Databricks jobs. Finally, you will understand how to authenticate to Azure Databricks using a PAT.
We're going to cover the following recipes:
  • Creating a Databricks service in the Azure portal
  • Creating a Databricks service using the Azure CLI (Command-Line Interface); a minimal CLI sketch follows this list
  • Creating a Databricks Service using Azure Resource Manager (ARM) templates
  • Adding users and groups to the workspace
  • Creating a cluster from the user interface (UI)
  • Getting started with Notebooks and jobs in Azure Databricks
  • Authenticating to Databricks using a personal access token (PAT)
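
As a quick preview of the Azure CLI recipe referenced above, here is a minimal sketch of creating a workspace from the command line. It assumes the Azure CLI is installed; the subscription ID, resource group name, workspace name, and region are placeholders you would replace with your own values:

  # Sign in and select the subscription to deploy into
  az login
  az account set --subscription "<subscription-id>"

  # Create a resource group to hold the workspace
  az group create --name <resource-group-name> --location <region>

  # The Databricks commands ship as a CLI extension
  az extension add --name databricks

  # Create the workspace with the Standard pricing tier
  az databricks workspace create \
      --resource-group <resource-group-name> \
      --name <workspace-name> \
      --location <region> \
      --sku standard

The --sku parameter selects the pricing tier (standard, premium, or trial); the recipes later in this chapter produce the same result through the Azure portal and ARM templates.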

Technical requirements

To follow along with the examples in this chapter, you will need to have the following:
  • An Azure subscription
  • The Azure CLI installed on one of the following platforms:
    (a) For Windows: Install the Azure CLI for Windows (https://docs.microsoft.com/en-us/cli/azure/install-azure-cli-windows?tabs=azure-cli)
    (b) For macOS: Install the Azure CLI on macOS (https://docs.microsoft.com/en-us/cli/azure/install-azure-cli-macos)
    (c) For Linux: Install the Azure CLI on Linux (https://docs.microsoft.com/en-us/cli/azure/install-azure-cli-linux?pivots=apt)
  • You can find the scripts for this chapter at https://github.com/PacktPublishing/Azure-Databricks-Cookbook/tree/main/Chapter01. The Chapter-01 folder contains the script for this chapter.
  • As an Azure AD user, you will need the Contributor role in your subscription to create the Azure Databricks service via the Azure portal. You must also be an admin of the Azure Databricks workspace.
  • The latest version of Power BI Desktop (https://www.microsoft.com/en-in/download/details.aspx?id=58494). We will use this to access Spark tables using PAT authentication; a brief sketch of PAT-based authentication follows this list.
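
A PAT is not tied to Power BI; any client that can send an HTTP header can use it to authenticate against the Databricks REST API. As a hedged illustration (this is not the book's Power BI walkthrough), the workspace URL and token below are placeholders:

  # List the clusters in a workspace by authenticating with a PAT
  # <databricks-instance> is the workspace URL, for example adb-<workspace-id>.<random-number>.azuredatabricks.net
  curl -s -H "Authorization: Bearer <personal-access-token>" \
      https://<databricks-instance>/api/2.0/clusters/list

In Power BI Desktop, the same token is supplied as the credential when connecting to a cluster, which is what the PAT recipe at the end of this chapter walks through.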

Creating a Databricks workspace in the Azure portal

There are multiple ways to create an Azure Databricks service. This recipe focuses on creating the service in the Azure portal, a method usually used for learning purposes or ad hoc requests. The preferred methods for creating the service are Azure PowerShell, the Azure CLI, and ARM templates.
By the end of this recipe, you will have learned how to create an Azure Databricks service instance using the Azure portal.

Getting ready

You will need access to an Azure subscription and the Contributor role on that subscription.

How to do it…

Follow these steps to create a Databricks service using the Azure portal:
  1. Log into the Azure portal (https://port...
