SAP - Course Details

DI:ETLD (BOI205-E) Data Integrator: Extracting, Transforming and Loading Data

SAP Course Description:
After completing the course, learners will be able to:
   -Describe the BusinessObjects Data Integrator architecture
   -Define source and target metadata to import into ETL jobs
   -Create a batch job
   -Validate, trace and debug batch jobs
   -Use built-in transforms and functions to support data flow movement requirements
   -Optimize data flows
   -Use variables, parameters and the Data Integrator Scripting Language
   -Capture changes in data
   -Handle errors and exceptions
   -Support a multi-user environment, administer batch jobs and migrate projects using datastore and system configurations
   -Manage metadata
SAP Course Duration:
Self-Paced
SAP Course Target Audience:
This course is designed for individuals responsible for implementing projects that involve extracting, transforming and loading data in batch jobs, and for those who administer and manage Data Integrator projects.
SAP Course Prerequisite:
Knowledge of data warehousing concepts; experience with Microsoft SQL Server; knowledge of normal forms of data and the SQL language; experience using functions, elementary procedural programming and flow-of-control statements (for example: If, Else, While, Loop).
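To illustrate the expected prerequisite level, the sketch below shows elementary procedural programming with flow-of-control statements (if/else, a while loop) and a simple function. It is a generic example in Python, not Data Integrator's own scripting language, and all names in it are illustrative only.

```python
# Illustrative only: the prerequisite level of procedural programming.
# Labels each batch row count as 'empty', 'small', or 'large'.
def classify_row_counts(counts):
    labels = []
    i = 0
    while i < len(counts):       # While loop over the input list
        if counts[i] == 0:       # If / Else flow of control
            labels.append("empty")
        elif counts[i] < 1000:
            labels.append("small")
        else:
            labels.append("large")
        i += 1
    return labels

print(classify_row_counts([0, 250, 5000]))
```

Learners comfortable writing a function like this, and the equivalent logic in SQL, meet the stated prerequisite.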
Course Content:
  • Data Warehousing Concepts 
       -Describe normal forms of data 
       -Explain dimensional modeling

  • Understanding Data Integrator 
       -Understand Data Integrator Architecture and Interface and development process 
       -Define objects 
       -Explain relationships between objects, projects and jobs 
       -Describe the Data Integrator development process

  • Defining Source and Target Metadata 
       -Use datastores 
       -Import metadata 
       -Define a file format

  • Creating a Batch Job 
       -Create a batch job 
       -Create a simple data flow 
       -Add source and target objects to a data flow 
       -Use the Query transform 
       -Execute the job 
       -Add a new table to a data flow using template tables

  • Validating, Executing and Debugging Jobs 
       -Use descriptions and annotations 
       -Validate and trace jobs 
       -Debug jobs

  • Using built-in Transforms 
       -Describe built-in transforms 
       -Create an embedded data flow

  • Using built-in Functions 
       -Define and use built-in functions 
       -Use functions in expressions

  • Optimizing Data Flow 
       -Optimize source and target based performance 
       -Optimize job performance 
       -Understand table partitioning and parallel execution in data flows

  • Using Variables, Parameters and Scripting 
       -Understand variables, parameters and the DI Scripting Language 
       -Script a custom function

  • Capturing Changes in Data 
       -Use source-based and target-based changed data capture

  • Handling Errors and Exceptions 
       -Understand recovery systems 
       -Process data with problems

  • Supporting a multi-user environment 
       -Set up and work in a multi-user environment 
       -Describe common tasks in a multi-user environment

  • Migrating Projects between Design, Test and Production Phases 
       -Understand migration mechanisms and tools 
       -Use datastore configurations and migration 
       -Migrate multi-user and single-user environments 
       -Use datastore configurations to improve job portability

  • Using the Web Administrator 
       -Use the Web Administrator 
       -Implement central repository security 
       -Manage batch jobs with the Administrator 
       -Understand server groups

  • Managing Metadata 
       -Understand metadata 
       -Use metadata reporting 
       -Use the Metadata Reporting tool