Monday, May 9, 2011

Performance Testing Citrix Presentation Server™ Platform

Citrix Presentation Server™ provides dynamic solutions to information-access challenges, but it presents testing complications that are unique to the platform.

While you are confident that your decision to use Citrix will provide cost-efficient accessibility to necessary applications, your business and technical teams are wondering whether it will scale. 

What do you do? 

Conventional scalability/load/performance tools will not work with applications delivered through Citrix. Why? Because these tools were built to capture communication at the protocol level, and Citrix communicates through its proprietary ICA protocol. Few of the conventional tool vendors have a testing solution for this protocol. 

RTTS, the leader in test automation, circumvents this issue by providing scalability/load/performance testing of Citrix Presentation Server and the ICA architecture. 
We implement leading-edge Citrix testing tools that use event- and image-driven verification and synchronization of objects from the user's perspective.

 
The test tool has these capabilities (a sketch of the verification-and-timing pattern follows the list): 

  • Recognize and evaluate the targeted application's response during test script playback by comparing the application's display to images stored in a user-created baseline. 
  • Interact with the target application by issuing standard keyboard and mouse events to the system under test, in accordance with the user-defined test script logic. 
  • Act as a "virtual" human user to detect, evaluate, measure, and respond to on-screen activity based on the model represented by the test script logic. 
  • Provide performance-benchmarking response-timer functions that measure response times to within milliseconds. 
  • Scale up to as many concurrent users as the business demands. 
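
In outline, the verification-and-timing pattern looks like this. This is a minimal Python sketch, not any vendor's API: capture_screen_region is a hypothetical stand-in for whatever screen-capture call a Citrix-aware test tool exposes.

```python
import hashlib
import time

def capture_screen_region(x, y, width, height):
    """Hypothetical stand-in for the test tool's screen-capture call.

    A real Citrix-aware tool captures the ICA session's rendered display;
    here we only define the interface the sketch relies on."""
    raise NotImplementedError("replace with your tool's capture API")

def region_matches_baseline(region_bytes, baseline_digest):
    # Image-driven verification: compare the captured region against a
    # user-created baseline by digest (real tools use fuzzier matching).
    return hashlib.sha256(region_bytes).hexdigest() == baseline_digest

def timed_transaction(submit_action, region, baseline_digest,
                      timeout=30.0, poll=0.25):
    """Response-timer pattern: start the clock at the input event and stop
    when the expected screen state appears, to millisecond resolution."""
    start = time.perf_counter()
    submit_action()                      # e.g. send keyboard/mouse events
    while time.perf_counter() - start < timeout:
        if region_matches_baseline(capture_screen_region(*region),
                                   baseline_digest):
            return (time.perf_counter() - start) * 1000.0   # milliseconds
        time.sleep(poll)
    raise TimeoutError("expected screen state never appeared")
```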
 
RTTS is able to deliver a Citrix testing service that incorporates planning, design, scripting, test execution and analysis of the performance of your Citrix environment. We provide an all-inclusive model that combines short-term rental pricing for the Citrix load testing tool with expert programming and analysis services to implement the tool and interpret the results. 

We can identify and manage system performance risk, helping to avoid downtime, improve company productivity and deliver reliable services.

ETL ARCHITECTURES – CONCEPTS AND IMPLEMENTATION

ETL Architectures is an in-depth, technical course that teaches the concepts for designing and implementing the appropriate architectures to use in managing the extraction, transformation and loading (ETL) of data for:

  • High-performance decision support environments (data warehouses, dimensional data marts, Operational Data Stores (ODS), etc.)
  • Master Data Management hubs (Customer Data Integration (CDI), Product Information Management (PIM), etc.)
  • General data integration (e.g., Service Oriented Architectures (SOA))
This course reviews these architectures, with a primary focus on the concepts and techniques that apply to the various approaches to ETL. Participants will learn when to use certain techniques based on their technical and business requirements. Through hands-on workshops, attendees will study different ETL products and methodologies for implementation in today's heterogeneous system environments.

Benefits To Your Company
By learning the best way to design ETL architectures, architects and ETL developers will be able to implement the appropriate tools and techniques to satisfy business requirements and relate them to the supporting data structures. They will:

  • Understand the concepts of extraction, transformation and loading in decision support systems, master data management systems and SOA environments
  • Understand the various forms of data architectures and how to apply ETL techniques to each
  • Understand sophisticated techniques for more complicated ETL solutions (real-time, high-volume, etc.)
  • Construct ETL architectures that are flexible enough to support changing business and technical requirements
  • Learn about the most common ETL products and their strengths and weaknesses
Who Should Attend
  • Data Warehouse Architects
  • Enterprise Architects (Data, Technical)
  • ETL Developers
  • Data Architects
  • Business Intelligence designers
  • Database designers
  • Database administrators (DBAs)
What Makes This Certified Course Unique
This ICCP-certified course provides participants with practical, in-depth understanding of how to create appropriate ETL architectures for decision support and data integration solutions. Hands-on workshops throughout the course will reinforce the learning experience and provide the attendees with concrete results that can be utilized in their organizations.

Course Outline
  • Review common system architectures
      - Transaction Processing
      - Decision Support
      - Master Data Management
      - Service Oriented Architecture
  • ETL Concepts
      - General principles
      - Design and plan for reuse
      - Design for error handling
      - Design for performance
      - Design for maintainability
      - ETL Standards
      - ETL and Meta Data
      - ETL Tool Usage
  • ETL for Decision Support
      - ETL for the Data Warehouse
          - Data Sourcing / Changed Data Capture
          - Data Transport
          - Data Staging
          - Changed Data Determination
          - Loading normalized warehouse structures
      - ETL for the Data Mart
          - Surrogate key lookup and assignment
          - Slowly Changing Dimensions - Types 1, 2, 3 & 6 (see the sketch after this outline)
          - Denormalization and its impact on ETL
          - Populating "junk" dimensions using a Cartesian product
          - Aggregation
      - ETL for the ODS
          - Real/near-time approaches
          - Data modeling differences
          - Row-level security
          - Closing the loop
  • ETL for Master Data Management (MDM) and Service Oriented Architectures (SOA)
      - Customer Data Integration (CDI)
      - Product Information Management (PIM)
      - Integrating ETL and SOA environments
      - Integrating ETL with Data Quality tools
      - Integration with OLTP systems
  • ETL Tools
      - Leading ETL tool vendors
      - ETL tool strengths / weaknesses
      - Choosing the correct ETL tool
  • High-performance ETL
      - Indexing (b-tree, bitmap, join indexes, etc.)
      - Forms of parallelism
      - RDBMS tuning and ETL
      - Massively Parallel Processing (MPP) platforms vs. Symmetrical Multiprocessing (SMP) platforms
      - ETL query optimization
  • Workshop conclusion
      - Summary, additional exercises, sources for further reading, etc.
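
As a taste of the data mart material, here is a minimal sketch of a Slowly Changing Dimension Type 2 update. The in-memory representation, column names and date handling are illustrative assumptions, not part of the course content.

```python
from datetime import date

def scd_type2_apply(dim_rows, incoming, business_key="cust_nbr",
                    tracked=("cust_type", "status")):
    """Minimal SCD Type 2: expire the current row when a tracked attribute
    changes and insert a new current row (hypothetical column names)."""
    today = date.today().isoformat()
    current = next((r for r in dim_rows
                    if r[business_key] == incoming[business_key]
                    and r["is_current"]), None)
    if current and all(current[c] == incoming[c] for c in tracked):
        return dim_rows                      # no change: nothing to do
    if current:                              # change detected: expire old row
        current["is_current"] = False
        current["effective_to"] = today
    new_key = max((r["surrogate_key"] for r in dim_rows), default=0) + 1
    dim_rows.append({**incoming, "surrogate_key": new_key,
                     "effective_from": today, "effective_to": None,
                     "is_current": True})
    return dim_rows

rows = [{"surrogate_key": 1, "cust_nbr": 42, "cust_type": "R1",
         "status": "ACTIVE", "effective_from": "2010-01-01",
         "effective_to": None, "is_current": True}]
scd_type2_apply(rows, {"cust_nbr": 42, "cust_type": "R2", "status": "ACTIVE"})
print([(r["surrogate_key"], r["is_current"]) for r in rows])
# -> [(1, False), (2, True)]
```
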
Standard Duration
3 days
To learn more about how EWSolutions can provide our World-Class Training for your company or to request a quote, please feel free to contact David Marco, our Director of Education at DMARCO@EWSOLUTIONS.COM or call him at 630.920.0005 ext. 103.

Database Tuning

With the advent of the e-commerce revolution, much focus has been directed toward the web-based technologies that support the Internet, such as Java, HTTP, Web Services and XML. The common thread among these technologies is their dependency upon relational databases to provide the data essential to their business paradigm. Database platforms such as Oracle9i, IBM DB2, Microsoft SQL Server and Sybase Adaptive Server Enterprise therefore provide the core foundation upon which business decisions are made and revenue is produced. 

Whether implementing an on-line transaction processing system (OLTP) or decision support system (DSS) within a standard client/server application or distributed web-based application, the requirements are the same:

1. retrieve the data as quickly as possible, 
2. support hundreds or thousands of end-users, and 
3. keep hardware and software maintenance costs at a minimum. 

Database performance tuning is the iterative process of analyzing the ramifications of hardware and/or software configuration changes with the intent of increasing application performance while minimizing costs. 

RTTS has successfully assisted in many database performance tuning engagements. Armed with a proven testing methodology and test automation best practices, RTTS has provided an integral solution for resolving many issues associated with the relational database management system (RDBMS) and operating system kernel parameters. These include: 

  • providing an inventory of slow or inefficient database queries 
  • determining the proper size of connection pools to support the arrival rate of SQL requests (see the sizing sketch after this list) 
  • discovering the inability of an RDBMS to scale on a multiprocessor database server (RS/6000 SMP) 
  • establishing the best configuration sizes for data and procedure caches 
  • ascertaining the best hardware platform to implement 
  • discerning the most efficient auditing scheme that prevents deadlocks while maintaining a history of the business processes 
  • validating the correct indexes to employ, such as clustered versus non-clustered indexes 
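
For the connection-pool item above, the arithmetic is standard queueing theory rather than anything RTTS-specific: by Little's Law, the average number of in-flight requests equals the arrival rate times the average service time. A minimal sketch, where the headroom factor is an assumed safety margin for bursts:

```python
import math

def pool_size(arrival_rate_per_sec, avg_service_time_sec, headroom=1.25):
    """Little's Law (L = lambda * W): the average number of in-flight SQL
    requests equals arrival rate times service time; add headroom for bursts."""
    return math.ceil(arrival_rate_per_sec * avg_service_time_sec * headroom)

# e.g. 200 SQL requests/sec at 50 ms average service time
print(pool_size(200, 0.050))   # -> 13 connections
```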

The database performance tuning realm also extends to the database's clients, such as application servers and fat clients (e.g., PowerBuilder, Visual Basic, C++). RTTS has pinpointed issues relating to the manner in which data is requested and client/server communication is conducted. 

How does RTTS solve the problem? 

Regardless of the RDBMS that is implemented, RTTS has a solution for tuning database servers and their clients. Although the configurable parameter terminology differs by platform, the same performance tuning concepts apply to all database server vendors. 
1. Determine the level of tuning - Component-level or system-level? Do you want to tune the database server as an isolated component or as part of a larger application? 
2. Understand the end-user community - Gather metrics on how the database will be accessed. What SQL queries will be executed? What business transactions will be executed? How often are transactions executed? 
3. Gather performance requirements - Establish the exit criteria for tuning so that you know when sufficient testing has occurred. 
4. Automate test scripts - Create automated test scripts that issue the necessary SQL queries, updates and deletes, and generate automated test scripts that emulate the business scenarios (a minimal timing-script sketch follows this list). 
5. Execute & analyze tests - Run the planned tests and collect metrics such as response times, transaction volumes, operating system statistics and database server statistics. 
6. Profile the application - Implement ancillary tools to profile transaction characteristics: determine the network characteristics of a transaction, such as bandwidth utilization and conversational chattiness, and ascertain CPU utilization on the database server and client, memory utilization, and query compilation and execution times. 
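
Steps 4 and 5 in miniature: a hypothetical Python sketch that issues a scripted query mix and collects response times. It uses an in-memory SQLite database as a stand-in for the RDBMS under test; the table and queries are invented for illustration.

```python
import sqlite3
import statistics
import time

# Hypothetical query mix; in an engagement these come from step 2
# (understanding how the end-user community accesses the database).
QUERY_MIX = [
    "SELECT COUNT(*) FROM orders WHERE status = 'OPEN'",
    "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id",
]

def run_timed_queries(conn, iterations=100):
    """Issue the scripted SQL repeatedly and collect per-query response times."""
    timings = {q: [] for q in QUERY_MIX}
    for _ in range(iterations):
        for query in QUERY_MIX:
            start = time.perf_counter()
            conn.execute(query).fetchall()
            timings[query].append((time.perf_counter() - start) * 1000.0)
    return timings

conn = sqlite3.connect(":memory:")   # stand-in for the RDBMS under test
conn.execute("CREATE TABLE orders (customer_id INT, status TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i % 50, "OPEN" if i % 3 else "CLOSED", i * 1.5)
                  for i in range(1000)])
for query, ms in run_timed_queries(conn).items():
    print(f"median {statistics.median(ms):6.2f} ms  {query}")
```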

The Solution 
As a result, database server capacity and scalability are increased by addressing: 
  • the use of a small packet size between the client and the server 
  • chatty conversations over high-latency network links 
  • large amounts of unused data returned to the client 
  • redundant database queries 
  • additional tuning methods 

Deliverables 
At the conclusion of the project, RTTS provides an Executive Summary report illustrating the performance of your application and/or database server, as quantified by response times, throughput, application and communication errors, and system resources and capacity, all related to the particular database server tuning parameters. 

The engagement will also provide a suite of automated test scripts that can be used for future testing and tuning endeavors along with a set of best practices for approaching database server performance tuning.

ETL (Extract-Transform-Load) for Data Warehousing


Stocking the data warehouse with data is often the most time-consuming task in making data warehousing and business intelligence a success. In the overall scheme of things, Extract-Transform-Load (ETL) often requires about 70 percent of the total effort.

Extracting data for the data warehouse includes:

    Making ETL Architecture Choices
    Data Mapping
    Extracting data to staging area
    Applying data cleansing transformations
    Applying data consistency transformations
    Loading data

  

Before starting the ETL step of a data warehousing and business intelligence project, it is important to determine the business requirements. See the article Requirements for Data Warehousing and Business Intelligence for more information.

Also, the data sources and targets must be defined. See the articles Data Sources for Data Warehousing and Business Intelligence and Data Models for Data Warehousing and Business Intelligence to understand these.


Making ETL Architecture Choices for the Data Warehouse

ETL has a prominent place in data warehousing and business intelligence architecture.
[Diagram: Data Warehousing Architecture]
The extract, transformation and loading process includes a number of steps:
[Diagram: ETL Data Warehousing Processes]
Create your own diagrams that show the planned ETL architecture and the flow of data from source to target.
Selecting the right ETL tools is critical to the success of the data warehousing and business intelligence project. Should your company acquire a top-of-the-line specialized ETL tool suite, use lower-cost open source ETL, or use the "tools at hand"? The article ETL Tool Selection for the Data Warehouse describes these options along with their pros and cons.
Consider these performance improvement methods:
  • Turn off database logging to avoid the overhead of log insertions
  • Load using a bulk load utility which does not log
  • Primary keys should be single integers 
  • Drop relational integrity (RI) / foreign keys - restore after load is complete
  • Drop indexes and re-build after load
  • Partition data leaving data loaded earlier unchanged
  • Load changed data only  - use "delta" processing
  • Avoid SQL Update with logging overhead - possibly drop rows and reload using bulk loader
  • Do a small number of updates with SQL Update, then use bulk load for inserts
  • Use a Cyclic Redundancy Checksum (CRC) to detect changes in data, rather than the brute-force method of comparing each column (see the sketch after this list)
  • Divide SQL Updates into groups to avoid creating a large rollback log
  • Use an ETL tool that supports parallelism
  • Use an ETL tool that supports caching
  • Use RAID technologies
  • Use fast disk and controllers - 15,000 RPM
  • Dedicate servers and disk to business intelligence - do not share with other applications 
  • Use multiple servers to support BI such as: a database server, an analysis server and a reporting server
  • Use a server with large main memory (16 GB +) - this increases data caching and reduces physical data access 
  • Use a server with multiple processors / cores to enable greater parallelism
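
The CRC bullet deserves a sketch. The idea is to checksum each row once and compare checksums across loads instead of comparing every column; note that any checksum admits rare collisions, so some shops pair it with a stronger hash. A minimal Python sketch with invented column names:

```python
import zlib

def row_checksum(row, columns):
    """Concatenate the row's column values and checksum them once, instead of
    comparing each column against the prior load (the brute-force method)."""
    payload = "|".join(str(row[c]) for c in columns)
    return zlib.crc32(payload.encode("utf-8"))

old = {"cust_id": 42, "name": "Acme", "status": "ACTIVE"}
new = {"cust_id": 42, "name": "Acme", "status": "LAPSED"}
cols = ["name", "status"]
if row_checksum(new, cols) != row_checksum(old, cols):
    print("row 42 changed - include in delta load")
```
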

Data Mapping for Data Warehousing and Business Intelligence

A Data Map is a specification that identifies data sources and targets as well as the mapping between them. The Data Map specification is created and reviewed with input from business Subject Matter Experts (SMEs) who understand the data.
There are two levels of mapping: entity level and attribute level. Each target entity (table) has a high-level mapping description, supported by a detailed attribute-level mapping specification.
Target Table Name: dw_customer
Target Table Description: High-level information about a customer, such as name, customer type and customer status.
Source Table Names: dwprod1.dwstage.crm_cust, dwprod1.dwstage.ord_cust
Join Rules: crm_cust.custid = ord_cust.cust_nbr
Filter Criteria: crm_cust.cust_type <> 7
Additional Logic: N/A
Then, for each attribute, the attribute-level data map specifies:
  • Source: table name, column name, datatype
  • Target: table name, column name, datatype
  • Transformation Rule
  • Notes
[Diagram: Attribute-Level Data Map for Data Warehousing]
Transformations may include (a sketch applying one such rule follows this list):
  • Aggregate
  • Substring
  • Concatenate
  • Breakout Array Values / Buckets
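
To make the attribute-level specification concrete, here is one hypothetical map entry for the dw_customer example above, expressed as a small Python structure. The transformation rule, column names and datatypes are illustrative assumptions, not part of any real spec.

```python
# One attribute-level map entry, using the dw_customer example above
# (the transformation rule here is illustrative, not from the spec).
attribute_map = {
    "source": {"table": "crm_cust", "column": "cust_type",
               "datatype": "CHAR(2)"},
    "target": {"table": "dw_customer", "column": "customer_type_cd",
               "datatype": "CHAR(2)"},
    "transformation": lambda v: v.strip().upper(),  # standardize the code
    "notes": "Codes standardized to upper case before load",
}

raw_value = " r1 "
loaded_value = attribute_map["transformation"](raw_value)
print(loaded_value)   # -> "R1"
```
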

DATA WAREHOUSE TESTING IS DIFFERENT

Data warehouse population is performed mostly through batch runs, so testing a data warehouse differs from testing transaction systems.

Data warehouse testing differs from typical transaction system testing on the following counts:

User-Triggered vs. System-Triggered

Most production/source system testing involves the processing of individual transactions, driven by input from the users (e.g., an application form or a servicing request). Very few test cycles cover system-triggered scenarios (such as billing or valuation).

In a data warehouse, most of the testing is system-triggered, per the scripts for ETL (Extraction, Transformation and Loading), the view-refresh scripts, etc.

Therefore data warehouse testing is typically divided into two parts: 'back-end' testing, where the source system data is compared to the end-result data in the loaded area, and 'front-end' testing, where users check the data by comparing their MIS with the data displayed by end-user tools such as OLAP.

Batch vs. Online Gratification

This is what makes it a challenge to retain users' interest.

A transaction system provides instant, or at least overnight, gratification: when users enter a transaction, it is processed online or at most via an overnight batch. In a data warehouse, most of the action happens in the back-end, and users have to trace individual transactions through to the MIS and views produced by the OLAP tools. It is the same challenge as asking users to test the mammoth month-end reports and financial statements churned out by transaction systems.

Volume of Test Data

The test data in a transaction system is a very small sample of the overall production data. Typically, to keep matters simple, we include only as many test cases as are needed to comprehensively cover all possible test scenarios within a limited set of test data.

A data warehouse typically requires large volumes of test data, because one tries to fill up the maximum possible combinations and permutations of dimensions and facts.

For example, if you are testing the location dimension, you would like the location-wise sales revenue report to show revenue figures for most of the 100 cities and the 44 states. This means you need thousands of sales transactions at the sales office level (assuming that sales office is the lowest level of granularity for the location dimension).

Possible Scenarios / Test Cases

If a transaction system has, say, a hundred different scenarios, the valid and possible combinations of those scenarios are not unlimited. In a data warehouse, however, the permutations and combinations one can possibly test are virtually unlimited, because the core objective of a data warehouse is to allow all possible views of the data. In other words, 'you can never fully test a data warehouse.'

Therefore one has to be creative in designing test scenarios to gain a high level of confidence.

Test Data Preparation

This is linked to the points about possible test scenarios and the volume of data. Given that a data warehouse needs lots of both, the effort required to prepare test data is much greater.

Programming for Testing

In transaction systems, users/business analysts typically test the output of the system. In a data warehouse, because most of the action happens at the back-end, most of the data warehouse data quality testing and ETL testing is done by running separate stand-alone scripts (see the sketch below). These scripts compare pre-transformation to post-transformation data, for example by comparing aggregates, and throw out the pilferages. Users' roles come into play when their help is needed to analyze the discrepancies (if the designers or business analysts cannot figure them out).
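
A minimal sketch of such a stand-alone comparison script, using an in-memory SQLite database and invented staging/fact tables; a real script would run against the actual source and target.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stage_sales (region TEXT, amount REAL);  -- pre-transformation
    CREATE TABLE fact_sales  (region TEXT, amount REAL);  -- post-load
    INSERT INTO stage_sales VALUES ('EAST', 100.0), ('EAST', 50.0), ('WEST', 75.0);
    INSERT INTO fact_sales  VALUES ('EAST', 150.0), ('WEST', 70.0);
""")

query = "SELECT region, ROUND(SUM(amount), 2) FROM {} GROUP BY region"
pre  = dict(conn.execute(query.format("stage_sales")))
post = dict(conn.execute(query.format("fact_sales")))

# Throw out the pilferages: any region whose aggregate shrank in flight.
for region in pre:
    if pre[region] != post.get(region):
        print(f"{region}: staged {pre[region]}, loaded {post.get(region)}")
# -> WEST: staged 75.0, loaded 70.0
```
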

Various Types of Testing

    XML Testing
    Network Latency Modeling
    Java Testing (J2EE/EJB)
    Transaction Characterization
    Data Integrity Testing
    Load/Scalability Testing
    GUI Testing
    Performance Testing
    Issue/Defect Tracking
    Stress Testing
    Requirements Management
    Configuration Testing
    Interoperability Testing
    Volume Testing
    Functional Testing
    Concurrency Testing
    Integration Testing
    Resource Usage Testing
    Web Site Monitoring
    Infrastructure Testing
    SLA Testing
    Component Testing
    Security Testing
    Failover Testing
    Business Rules Testing
    Reliability Testing
    COM+ Testing


Testing Type: Description
XML Testing: Validation of XML data content on a transaction-by-transaction basis. Where desirable, validation of formal XML structure (metadata structure) may also be included.
Java Testing (EJB, J2EE): Direct exercise of class methods to validate that both object properties and methods properly reflect and handle data according to the business and functional requirements of the layer. Transactions may be exercised at this layer to measure both functional and performance characteristics.
Data Integrity Testing: Validation of system data at all data capture points in a system, including the front-end, middle or content tier, and back-end database. Data integrity testing includes strategies to examine and validate data at all critical component boundaries.
GUI Testing: Validation of GUI characteristics against GUI requirements.
Issue/Defect Tracking: Tracking software issues and defects is at the core of the software quality management process. Software quality can be assessed at any point in the development process by tracking the number of defects and their criticality. Software readiness for deployment can be analyzed by following defect trends for the duration of the project.
Requirements Management: Requirements both define the shape of the software (look-and-feel, functionality, business rules) and set a baseline for testing. As such, requirements management, the orderly process of gathering requirements and keeping requirements documentation updated on a release-by-release basis, is critical to the deployment of quality software.
Interoperability Testing: Validation that applications in a given platform configuration do not conflict, causing loss of functionality.
Functional Testing: Validation of business requirements, GUI requirements and data handling in an application.
Security Testing: Validation that the security requirements of a system have been correctly implemented, including resistance to password cracking and Denial of Service (DoS) attacks, and that known security flaws have been properly patched.
Business Rules Testing: Validation that business rules have been properly implemented in a system, enforcing correct business practices on the user.
COM+ Testing: Direct exercise of COM methods to validate that both object properties and methods properly reflect and handle data according to the business and functional requirements of the COM layer. Transactions may be exercised at this layer to measure both functional and performance characteristics.
Integration Testing: Testing in which software components, hardware components, or both are combined and tested to evaluate the interaction between them.
Network Latency Modeling: Analysis of the fundamental amount of time it takes a given message to traverse a given distance across a specific network. This factor influences all messages that traverse a network and is key in modeling network behavior.
Transaction Characterization: Determining the footprint of business transactions, including bandwidth on the network and CPU and memory utilization on back-end systems. Also used in Network Latency Modeling and Resource Usage Testing.
Load/Scalability Testing: Increasing the load on the target environment until requirements are exceeded or a resource is saturated. Usually combined with other test types to optimize performance (see the sketch after this table).
Performance Testing: Determining whether the test environment meets requirements at set loads and mixes of transactions, by testing specific business scenarios.
Stress Testing: Exercising the target system or environment at the point of saturation (depletion of a resource: CPU, memory, etc.) to determine whether the behavior changes and possibly becomes detrimental to the system, application or data.
Configuration Testing: Testing various system configurations to assess the requirements and resources needed.
Volume Testing: Determining the volume of transactions that a complete system can process. Volume Testing is conducted in conjunction with Component, Configuration and/or Stress Testing.
Resource Usage Testing: Multi-user testing conducted beyond Transaction Characterization to determine the total resource usage of applications and subsystems or modules.
Concurrency Testing: Multi-user testing geared toward determining the effects of accessing the same application code, module or database records. Identifies and measures the level of locking, deadlocking and use of single-threaded code and locking semaphores.
Infrastructure Testing: Verifying and quantifying the flow of data through the environment's infrastructure.
Component Testing: The appropriate tests are conducted against each component individually to verify that it can carry its expected load without failure. This testing is typically conducted while the environment is being assembled, to identify any weak links.
Failover Testing: In environments that employ redundancy and load balancing, Failover Testing analyzes the theoretical failover procedure, then tests and measures the overall failover process and its effects on the end-user.
Reliability Testing: Once the environment or application is working and optimized for performance, a longer (24- to 48-hour) Reliability Test determines whether there are any long-term detrimental issues that may affect performance in production.
SLA Testing: Specialized business transaction testing to measure Service Level Agreements with third-party vendors. The typical agreement guarantees a specified volume of activity over a predetermined time period with a specified maximum response time.
Web Site Monitoring: Monitoring business transaction response times after production deployment to ensure end-user satisfaction.
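
To illustrate the load/scalability pattern referenced above, here is a minimal Python sketch that replays a stub transaction at increasing concurrency and reports the response-time distribution. A real test would drive the actual system under test rather than a sleep; the numbers here are invented.

```python
import concurrent.futures
import statistics
import time

def business_transaction():
    """Stub transaction; in a real test this drives the system under test."""
    time.sleep(0.01)          # simulate 10 ms of server work

def measure_at_load(virtual_users, iterations=20):
    """Ramp pattern behind Load/Scalability Testing: run the same transaction
    at increasing concurrency and watch the response-time distribution."""
    def one_user():
        samples = []
        for _ in range(iterations):
            start = time.perf_counter()
            business_transaction()
            samples.append((time.perf_counter() - start) * 1000.0)
        return samples
    with concurrent.futures.ThreadPoolExecutor(max_workers=virtual_users) as pool:
        futures = [pool.submit(one_user) for _ in range(virtual_users)]
        results = [s for f in futures for s in f.result()]
    return statistics.median(results), max(results)

for users in (1, 10, 50):
    med, worst = measure_at_load(users)
    print(f"{users:3d} users: median {med:.1f} ms, worst {worst:.1f} ms")
```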
 


Testing the Data Warehouse

Testing the data warehouse and business intelligence system is critical to success.  Without testing, the data warehouse could produce incorrect answers and quickly lose the faith of the business intelligence users. Effective testing requires putting together the right processes, people and technology and deploying them in productive ways.

Data Warehouse Testing Responsibilities

Who should be involved with testing?  The right team is essential to success:

Business Analysts gather and document requirements
QA Testers develop and execute test plans and test scripts
Infrastructure people set up test environments
Developers perform unit tests of their deliverables
DBAs test for performance and stress
Business Users perform functional tests including User Acceptance Tests (UAT)
Business Requirements and Testing

When should your project begin to think about testing?  The answer is simple - at the beginning of the project.  Successful testing begins with the gathering and documentation of requirements.  Without requirements there is no measure of system correctness.

Expect to produce a Requirements Traceability Matrix (RTM) that cross-references data warehouse and business intelligence features to business requirements; a minimal example follows.  The RTM is a primary input to the Test Plan.
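
A minimal illustration of an RTM as a data structure, with invented requirement and test-case IDs:

```python
# A minimal Requirements Traceability Matrix: each business requirement
# maps to the features and test cases that cover it (IDs are hypothetical).
rtm = [
    {"req": "BR-01 Daily sales by region",
     "feature": "fact_sales + region dimension",
     "tests": ["TC-101", "TC-102"]},
    {"req": "BR-02 History retained 3 years",
     "feature": "SCD Type 2 on dim_customer",
     "tests": ["TC-210"]},
]

# Coverage check: every requirement should trace to at least one test case.
untested = [r["req"] for r in rtm if not r["tests"]]
print(untested or "all requirements trace to tests")
```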

Data Warehousing Test Plan

The Test Plan, typically prepared by the QA Testers, describes the tests that must be performed to validate the data warehousing and business intelligence system.  It describes the types of tests and the coverage of required system features.

Test Cases are the details that enable implementation of the Test Plan.  A Test Case itemizes the steps that must be taken to test the system, along with the expected results.  A Test Execution Log tracks each test along with the result (pass or fail) of each test item.

Testing Environments and Infrastructure

Multiple environments must typically be created and maintained to support the system during its lifecycle:

Development
QA
Staging / Performance
Production
These kinds of tools can facilitate testing and problem correction:

Automated test tool
Test data generator
Test data masker
Defect manager
Automated test scripts
Unit Testing for the Data Warehouse

Developers perform tests on their deliverables during and after the development process.  A unit test is performed on an individual component and is based on the developer's knowledge of what should be developed.

Unit testing should definitely be performed before deliverables are turned over to QA by developers; tested components are likely to have fewer bugs.  A minimal sketch of such a test follows.
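
A minimal sketch of a developer unit test for a single ETL transformation; the function and its rules are invented for illustration.

```python
import unittest

def to_customer_type_cd(raw):
    """Hypothetical transformation under test: normalize a source code."""
    return raw.strip().upper() if raw and raw.strip() else "UNKNOWN"

class TransformUnitTest(unittest.TestCase):
    def test_normalizes_case_and_whitespace(self):
        self.assertEqual(to_customer_type_cd(" r1 "), "R1")

    def test_defaults_missing_values(self):
        self.assertEqual(to_customer_type_cd(""), "UNKNOWN")
        self.assertEqual(to_customer_type_cd(None), "UNKNOWN")

if __name__ == "__main__":
    unittest.main()
```
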

QA Testers Perform Many Types of Tests

QA Testers design and execute a number of tests:

Integration Test
Test the system's operation from beginning to end, focusing on how data flows through the system.  This is sometimes called "system testing" or "end-to-end testing".

Regression Test
Validate that the system continues to function correctly after being changed, to avoid "breaking" the system.


Can the Data Warehouse Perform?

Tests can be designed and executed that show how well the system performs with heavy loads of data:

Extract Performance Test
Test the performance of the system when extracting a large amount of data.

Transform and Load Performance Test
Test the performance of the system when transforming and loading a large amount of data.  Testing with a high volume is sometimes called a "stress test".

Analytics Performance Test
Test the performance of the system when manipulating the data through calculations.

Business Users Test Business Intelligence

Does the system produce the results desired by business users?  The main concern is functionality, so business users perform functional tests to make sure that the system meets business requirements.  The testing is performed through the user interface (UI) which includes data exploration and reporting.

Correctness Test
The system must produce correct results.  The measures and supporting context need to match the numbers in other systems and be calculated correctly.

Usability Test
The system should be as easy to use as possible.  Usability testing involves controlled observation of how business users use the business intelligence system to reach stated goals.

Performance Test
The system must be able to return results quickly without bogging down other resources.



Business Intelligence Must Be Believed

Quality must be baked into the data warehouse or users will quickly lose faith in the business intelligence produced.  It then becomes very difficult to get people back on board.

Putting the quality in requires both the testing described in this article and data quality at the source, described in the article Data Sources for Data Warehousing, to launch a successful data warehousing / business intelligence effort.

FUNDAMENTALS OF DATA WAREHOUSE TESTING

Description
This course introduces the student to the phases of testing and validation in a data warehouse or other decision support system project.  Students will learn the role of the testing process as part of a software development project, see how business requirements become the foundation for test cases and test plans, develop a testing strategy, develop audience profiles, and learn how to develop and execute effective tests, all as part of a data warehouse / decision support initiative.  Students will apply the data warehouse concepts in a series of related exercises that enable them to create and refine the various artifacts of testing for their data warehouse programs.

What Makes This Certified Course Unique
This ICCP-certified course provides participants with a practical, in-depth understanding of how to plan and execute testing for complex data warehouse and business intelligence solutions. Hands-on workshops throughout the course will reinforce the learning experience and provide attendees with concrete results that can be utilized in their organizations.


Course Objectives:
Upon completion of this course, students will be able to:

Review the fundamental concepts of data warehousing and its place in an information management environment
Learn about the role of the testing process as part of software development and as part of data warehouse development
Learn about test strategies, test plans and test cases – what they are and how to develop them, specifically for data warehouses and decision support systems
Create effective test cases and scenarios based on business and user requirements for the data warehouse
Plan and coordinate usability testing for data warehousing
Conduct reviews and inspections for validation and verification
Participate in the change management process and document relevant changes to decision support requirements
Prerequisites:

Experience as a test analyst, business analyst or experience in the testing process
Audience:

Testing analysts, business analysts, project managers, business staff members who will participate in the testing function; data warehouse architects, data analysts
Course Topics:
Understanding Business Intelligence

Analyze the current state of the data warehousing industry
Data warehousing fundamentals
Operational data store (ODS) concepts
Data mart fundamentals
Defining meta data and its critical role in data warehousing and testing
Key Principles in Testing

Introduction
Testing concepts
Overview of the testing and quality assurance phases
Project Management Overview

Basic project management concepts
Project management in software development and data warehousing
Testing and quality assurance as part of software project management
Requirements Definition for Data Warehouses

Requirements management workflow
Characteristics of good requirements for decision support systems
Requirements-based testing concepts and techniques
Audiences in Testing

Audiences and their profiles
User profiles
Customer profiles
Functional profiles
Testing strategies by audience
Test management overview
Risk Analysis and Testing

Risk analysis overview for testing
Test Methods and Testing Levels

Static vs. dynamic tests
Black, grey and white box testing
Prioritizing testing activities
Testing from unit to user acceptance
Test Plans and Procedures

Writing and managing test plans and procedures
Test plan structure and test design specifications
Test Cases Overview

Test case components
Designing test scenarios for data warehouse usage
Creating and executing test cases from scenarios
Validation and Verification

Validating customer needs for decision support
Tools and techniques for validation, verification and assessment
Acceptance Testing for Data Warehouses

Ways to capture informal and formal user issues and concerns
Test readiness review
Iterative testing for data warehouse projects
Reviews and Walk-throughs

Reviews versus walkthroughs
Inspections in testing and quality assurance
Testing Traceability

Linking tests to requirements with a traceability matrix
Change management in decision support systems and testing

Test Execution and Documentation

Managing the testing and quality assurance process
Documentation for the testing process
Conclusion
Summary, advanced exercises, resources for further study

To learn more about how EWSolutions can provide our World-Class Training for your company or to request a quote, please contact David Marco, our Director of Education, at DMarco@EWSolutions.com or call him at 630.920.0005 ext. 103.

Source: http://www.ewsolutions.com/education/data-warehouse-training/document.2008-09-28.9223227179