Book description
The premier open source ETL tool is at your command with this recipe-packed cookbook. Learn to use data sources in Kettle, avoid pitfalls, and dig into the advanced features of Pentaho Data Integration the easy way.
- Integrate Kettle with other components of the Pentaho Business Intelligence Suite to build and publish Mondrian schemas, create reports, and populate dashboards
- This book contains an organized sequence of recipes packed with screenshots, tables, and tips so you can complete the tasks as efficiently as possible
- Manipulate your data by exploring, transforming, validating, integrating, and performing data analysis
In Detail
Pentaho Data Integration is the premier open source ETL tool, providing easy, fast, and effective ways to move and transform data. While PDI is relatively easy to pick up, it can take time to learn the best practices so you can design your transformations to process data faster and more efficiently. If you are looking for clear and practical recipes that will advance your skills in Kettle, then this is the book for you.
Pentaho Data Integration Cookbook Second Edition explains Kettle features in detail and provides easy-to-follow recipes on file management and databases that can throw a curveball at even the most experienced developers.
Pentaho Data Integration Cookbook Second Edition provides updates to the material covered in the first edition as well as new recipes that show you how to use some of the key features of PDI that have been released since the first edition was published. You will learn how to work with various data sources, from relational and NoSQL databases to flat files, XML files, and more. The book also covers best practices that you can take advantage of immediately within your own solutions, such as building reusable code, ensuring data quality, and using plugins that can add even more functionality.
Pentaho Data Integration Cookbook Second Edition provides recipes that cover the common pitfalls that even seasoned developers can find themselves facing, and shows you how to use various data sources in Kettle as well as its advanced features.
Table of contents
- Pentaho Data Integration Cookbook Second Edition
- Table of Contents
- Pentaho Data Integration Cookbook Second Edition
- Credits
- About the Author
- About the Reviewers
- www.PacktPub.com
- Preface
- 1. Working with Databases
- Introduction
- Connecting to a database
- Getting data from a database
- Getting data from a database by providing parameters
- Getting data from a database by running a query built at runtime
- Inserting or updating rows in a table
- Inserting new rows where a simple primary key has to be generated
- Inserting new rows where the primary key has to be generated based on stored values
- Deleting data from a table
- Creating or altering a database table from PDI (design time)
- Creating or altering a database table from PDI (runtime)
- Inserting, deleting, or updating a table depending on a field
- Changing the database connection at runtime
- Loading a parent-child table
- Building SQL queries via database metadata
- Performing repetitive database design tasks from PDI
- 2. Reading and Writing Files
- Introduction
- Reading a simple file
- Reading several files at the same time
- Reading semi-structured files
- Reading files having one field per row
- Reading files with some fields occupying two or more rows
- Writing a simple file
- Writing a semi-structured file
- Providing the name of a file (for reading or writing) dynamically
- Using the name of a file (or part of it) as a field
- Reading an Excel file
- Getting the value of specific cells in an Excel file
- Writing an Excel file with several sheets
- Writing an Excel file with a dynamic number of sheets
- Reading data from an AWS S3 Instance
- 3. Working with Big Data and Cloud Sources
- 4. Manipulating XML Structures
- Introduction
- Reading simple XML files
- Specifying fields by using the XPath notation
- Validating well-formed XML files
- Validating an XML file against DTD definitions
- Validating an XML file against an XSD schema
- Generating a simple XML document
- Generating complex XML structures
- Generating an HTML page using XML and XSL transformations
- Reading an RSS Feed
- Generating an RSS Feed
- 5. File Management
- 6. Looking for Data
- Introduction
- Looking for values in a database table
- Looking for values in a database with complex conditions
- Looking for values in a database with dynamic queries
- Looking for values in a variety of sources
- Looking for values by proximity
- Looking for values by using a web service
- Looking for values over intranet or the Internet
- Validating data at runtime
- 7. Understanding and Optimizing Data Flows
- Introduction
- Splitting a stream into two or more streams based on a condition
- Merging rows of two streams with the same or different structures
- Adding checksums to verify datasets
- Comparing two streams and generating differences
- Generating all possible pairs formed from two datasets
- Joining two or more streams based on given conditions
- Interspersing new rows between existent rows
- Executing steps even when your stream is empty
- Processing rows differently based on the row number
- Processing data into shared transformations via filter criteria and subtransformations
- Altering a data stream with Select values
- Processing multiple jobs or transformations in parallel
- 8. Executing and Re-using Jobs and Transformations
- Introduction
- Launching jobs and transformations
- Executing a job or a transformation by setting static arguments and parameters
- Executing a job or a transformation from a job by setting arguments and parameters dynamically
- Executing a job or a transformation whose name is determined at runtime
- Executing part of a job once for every row in a dataset
- Executing part of a job several times until a condition is true
- Moving part of a transformation to a subtransformation
- Using Metadata Injection to re-use transformations
- 9. Integrating Kettle and the Pentaho Suite
- Introduction
- Creating a Pentaho report with data coming from PDI
- Creating a Pentaho report directly from PDI
- Configuring the Pentaho BI Server for running PDI jobs and transformations
- Executing a PDI transformation as part of a Pentaho process
- Executing a PDI job from the Pentaho User Console
- Generating files from the PUC with PDI and the CDA plugin
- Populating a CDF dashboard with data coming from a PDI transformation
- 10. Getting the Most Out of Kettle
- Introduction
- Sending e-mails with attached files
- Generating a custom logfile
- Running commands on another server
- Programming custom functionality
- Generating sample data for testing purposes
- Working with JSON files
- Getting information about transformations and jobs (file-based)
- Getting information about transformations and jobs (repository-based)
- Using Spoon's built-in optimization tools
- 11. Utilizing Visualization Tools in Kettle
- 12. Data Analytics
- A. Data Structures
- B. References
- Index
Product information
- Title: Pentaho Data Integration Cookbook Second Edition
- Author(s):
- Release date: December 2013
- Publisher(s): Packt Publishing
- ISBN: 9781783280674
You might also like
book
Pentaho Data Integration Beginner's Guide
Extract, Transform, and Load (ETL) is the essence of data integration and this book shows you …
book
Pentaho Data Integration Quick Start Guide
Get productive quickly with Pentaho Data Integration Key Features Take away the pain of starting with …
book
Pentaho® Kettle Solutions: Building Open Source ETL Solutions with Pentaho Data Integration
A complete guide to Pentaho Kettle, the Pentaho Data Integration toolset for ETL This practical book …
book
Pentaho® Solutions: Business Intelligence and Data Warehousing with Pentaho and MySQL®
Your all-in-one resource for using Pentaho with MySQL for Business Intelligence and Data Warehousing Open-source Pentaho …