Pentaho Data Integration Quick Start Guide

Create ETL processes using Pentaho

María Carina Roldán

About This Book

Get productive quickly with Pentaho Data Integration

Key Features

  • Take away the pain of starting with a complex and powerful system
  • Simplify your data transformation and integration work
  • Explore, transform, and validate your data with Pentaho Data Integration

Book Description

Pentaho Data Integration (PDI) is an intuitive, graphical environment packed with drag-and-drop design and powerful Extract-Transform-Load (ETL) capabilities. Given its power and flexibility, initial attempts to use the tool can be difficult or confusing. This book is the ideal solution.

This book reduces your learning curve with PDI. It provides the guidance needed to make you productive, covering the main features of Pentaho Data Integration. It demonstrates the interactive features of the graphical designer, and takes you through the main ETL capabilities that the tool offers.

By the end of the book, you will be able to use PDI for extracting, transforming, and loading the types of data you encounter on a daily basis.

What you will learn

  • Design, preview and run transformations in Spoon
  • Run transformations using the Pan utility
  • Understand how to obtain data from different types of files
  • Connect to a database and explore it using the database explorer
  • Understand how to transform data in a variety of ways
  • Understand how to insert data into database tables
  • Design and run jobs for sequencing tasks and sending emails
  • Combine the execution of jobs and transformations

Who this book is for

This book is for software developers, business intelligence analysts, and others involved or interested in developing ETL solutions, or more generally, doing any kind of data manipulation.

Information

Year: 2018
ISBN: 9781789342796
Edition: 1

Transforming Data

Transforming data is about manipulating the data that flows from step to step in a PDI transformation. There are many ways in which this can be done: we can modify incoming data, change datatypes, add new fields, fix erroneous data, sort and group rows, filter out unwanted information, aggregate data in several ways, and more. In this chapter, we will explain some of the possibilities.
The following is the list of topics that we will cover:
  • Transforming data in different ways
  • Sorting and aggregating data
  • Filtering rows
  • Looking up data

Transforming data in different ways

So far, we have seen how to create a PDI dataset mainly using data coming from files or databases. Once you have the data, there are many things you can do with it depending on your particular needs. One very common requirement is to create new fields whose values are based on the values of existing fields.
The set of operations covered in this section is not a full list of the available options, but includes the most common ones, and will inspire you when you come to implement others.
The files that we will use in this section were built with data downloaded from www.numbeo.com, a site containing information about living conditions in cities and countries worldwide.
To learn the topics in this chapter, you are free to create your own data. However, if you want to reproduce the exercises exactly as they are explained, you will need the aforementioned files, built with data from www.numbeo.com.
Before continuing, make sure you download the set of data that comes with the code bundle for the book.

Extracting data from existing fields

First, we will learn how to extract data from fields that exist in our dataset in order to generate new fields. For the first exercise, we will read a file containing data about the cost of living in Europe. The content of the file looks like this:
Rank City Cost of Living Index Rent Index Cost of Living Plus Rent Index Groceries Index Restaurant Price Index Local Purchasing Power Index
1 Zurich, Switzerland 141.25 66.14 105.03 149.86 135.76 142.70
2 Geneva, Switzerland 134.83 71.70 104.38 138.98 129.74 130.96
3 Basel, Switzerland 130.68 49.68 91.61 127.54 127.22 139.01
4 Bern, Switzerland 128.03 43.57 87.30 132.70 119.48 112.71
5 Lausanne, Switzerland 127.50 52.32 91.24 126.59 132.12 127.95
6 Reykjavik, Iceland 123.78 57.25 91.70 118.15 133.19 88.95
...
As you can see, the city field also contains the country name. The purpose of this exercise is to extract the country name from this field. In order to do this, we will go through the following steps:
  1. Create a new transformation and use a Text file input step to read the cost_of_living_europe.txt file.
  2. Drag a Split Fields step from the Transform category and create a hop from the Text file input towards the Split Fields step.
  3. Double-click the step and configure it, as shown in the following screenshot:
Configuring a Split Fields step
  4. Close the window and run a preview. You will see the following:
Previewing a transformation
As you can see, the Split Fields step can be used to split the value of a field into two or more new fields. This step is perfect for the purpose of obtaining the country name because the values were easy to parse. We had a value, then a comma, then another value. This is not always the case, but PDI has other steps for doing similar tasks. Let's look at another method for extracting pieces from a field.
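Before moving on, here is a minimal Python sketch, not part of the book, of what the Split Fields configuration above does; the dictionary and field names are purely illustrative:

# Split the City value on the first comma, the same parsing rule the
# Split Fields step applies: a value, a comma, then another value.
row = {"Rank": 1, "City": "Zurich, Switzerland", "Cost of Living Index": 141.25}

city, country = [part.strip() for part in row["City"].split(",", 1)]
row["City"] = city        # "Zurich"
row["Country"] = country  # "Switzerland"

print(row)

The exact field names you get in PDI depend on how you fill in the step's configuration grid.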
This time, we will read a file containing common daily food items and their prices. The file has two fields—food and price—and looks as follows:
Food Price
Milk (regular), (0.25 liter) 0.19 €
Loaf of Fresh White Bread (125.00 g) 0.24 €
Rice (white), (0.10 kg) 0.09 €
Eggs (regular) (2.40) 0.33 €
Local Cheese (0.10 kg) 0.89 €
Chicken Breasts (Boneless, Skinless), (0.15 kg) 0.86 €
...
Suppose that we want to split the Food field into three fields for the name, the quantity, and the units respectively. Taking the value in the first row, Milk (regular), (0.25 liter), as an example, the name would be Milk (regular), the quantity would be 0.25, and the units would be liter. We cannot solve this as we did before, but we can use regular expressions instead. In this case, the expression to use will be (.+)\(([0-9.]+)( liter| g| kg| head|)\).*.
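Before building the transformation, we can check that this expression captures what we expect. The following is a minimal Python sketch, not part of the book, applying the same regular expression to the first sample value; the variable names are illustrative:

import re

# Group 1 captures the name, group 2 the quantity, and group 3 the units
# (which may be empty), exactly as in the expression from the text.
pattern = re.compile(r"(.+)\(([0-9.]+)( liter| g| kg| head|)\).*")

match = pattern.match("Milk (regular), (0.25 liter)")
if match:                                  # in PDI, a match sets the result field to Y
    name, quantity, units = match.groups()
    print(name.strip(" ,"))                # Milk (regular)
    print(quantity)                        # 0.25
    print(units.strip())                   # liter

Note that group 1 keeps the trailing comma and space from the original value; if that matters, it can be cleaned up in a later step.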
Let's try it using the following steps:
  1. Create a new transformation and use a Text file input step to read the recommended_food.txt file.
In order to define the Price as a number, use the format #.00 €.
  2. Drag a Regex Evaluation step from the Scripting category and create a hop from the Text file input toward this new step.
  3. Double-click the step and configure it as shown in the following screenshot. Don't forget to check the Create fields for capture groups option:
Configuring a Regex Evaluation step
  4. Close the window and run a preview. You will see the following:
Previewing a transformation
The Regex Evaluation step can be used just to evaluate whether or not a field matches a regular expression, or to generate new fields, as in this case. By capturing groups, we were able to create a new field for each group captured from the original field. You will also notice a field named result, which in our example has a Y as its value. This Y means...
