Chapter 7. Importing and Transforming Data

Transact-SQL-based systems often interact with much older legacy systems. Data imported from such sources is frequently structured inefficiently and organized poorly. After importing such data, you usually want to transform it into a structure that can be handled efficiently and that readily supports a wide variety of queries.

This chapter introduces concepts and recipes useful for working with imported data tables and for transforming them. We show data normalization and other techniques for converting badly structured data into a form more acceptable for SQL processing. Denormalized data is usually found when importing from nonrelational systems or from files designed for human use. In general, the more human-readable the data is, the less efficiently it can be manipulated using SQL statements.
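As a minimal sketch of the idea (the table and column names here are hypothetical, not taken from the recipes that follow), a denormalized import table with one column per exam score can be unfolded into a normalized table with one row per score:

    -- Hypothetical imported table: one row per student, one column per exam
    CREATE TABLE StudentScores (
        StudentName VARCHAR(50),
        Exam1 INT,
        Exam2 INT,
        Exam3 INT
    );

    -- Normalized form: one row per student per exam
    CREATE TABLE Scores (
        StudentName VARCHAR(50),
        ExamNo      INT,
        Score       INT
    );

    -- Fold the repeating columns into rows
    INSERT INTO Scores (StudentName, ExamNo, Score)
    SELECT StudentName, 1, Exam1 FROM StudentScores
    UNION ALL
    SELECT StudentName, 2, Exam2 FROM StudentScores
    UNION ALL
    SELECT StudentName, 3, Exam3 FROM StudentScores;

The normalized Scores table is less convenient to read at a glance, but queries such as averages, rankings, or per-exam filters become simple GROUP BY and WHERE clauses rather than column-by-column arithmetic.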

Data rows can be inserted into SQL tables by using either the INSERT statement or an embedded import mechanism, such as the BULK INSERT statement. Once the rows are in SQL Server, they can be manipulated using standard SQL commands.
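For instance, both mechanisms might look like the following sketch, reusing the hypothetical Scores table from above; the file path, delimiters, and FIRSTROW setting are placeholders you would adjust to match your actual source file:

    -- Row-by-row load with a standard INSERT statement
    INSERT INTO Scores (StudentName, ExamNo, Score)
    VALUES ('Anna', 1, 92);

    -- Bulk load from a flat file (path and format options are placeholders)
    BULK INSERT Scores
    FROM 'C:\import\scores.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);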

In this chapter, we try to find a balance between the readability and efficiency of data tables. We discuss general techniques of folding and pivoting that can be used to transform human-readable data into SQL-friendly tables and back again. We show some techniques that you can use to ensure smooth linking between legacy data systems and SQL Server-based systems. We suggest some steps that you can follow to ...
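As a preview of the pivoting idea, one common CASE-based approach (again using the hypothetical Scores table, not a table from the recipes themselves) turns the normalized rows back into a single human-readable row per student:

    -- Pivot normalized rows into one row per student, one column per exam
    SELECT StudentName,
           MAX(CASE WHEN ExamNo = 1 THEN Score END) AS Exam1,
           MAX(CASE WHEN ExamNo = 2 THEN Score END) AS Exam2,
           MAX(CASE WHEN ExamNo = 3 THEN Score END) AS Exam3
    FROM Scores
    GROUP BY StudentName;

Folding is the reverse operation: breaking such a crosstab back down into one row per value so that it can be queried efficiently.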
