Monday, May 9, 2011

ETL (Extract-Transform-Load) for Data Warehousing


Stocking the data warehouse with data is often the most time-consuming task in making data warehousing and business intelligence a success. In the overall scheme of things, Extract-Transform-Load (ETL) often requires about 70 percent of the total effort.

ETL for the data warehouse includes:

    Making ETL Architecture Choices
    Data Mapping
    Extracting data to staging area
    Applying data cleansing transformations
    Applying data consistency transformations
    Loading data

  

Before starting the ETL step of the data warehousing and business intelligence project, it is important to determine the business requirements. See the article Requirements for Data Warehousing and Business Intelligence for more information.

Also, the data sources and targets must be defined. See the articles Data Sources for Data Warehousing and Business Intelligence and Data Models for Data Warehousing and Business Intelligence for background.


Making ETL Architecture Choices for the Data Warehouse

ETL has a prominent place in the data warehousing and business intelligence architecture.
[Figure: Data Warehousing Architecture]
The extract, transform and load process includes a number of steps:
[Figure: ETL Data Warehousing Processes]
Create your own diagrams that show the planned ETL architecture and the flow of data from source to target.
Selecting the right ETL tools is critical to the success of the data warehousing and business intelligence project. Should your company acquire a top-of-the-line specialized ETL tool suite, use lower-cost open source ETL, or use the "tools at hand"? The article ETL Tool Selection for the Data Warehouse describes these options along with their pros and cons.
Consider these performance improvement methods:
  • Turn off database logging to avoid the overhead of log insertions
  • Load using a bulk load utility which does not log
  • Primary keys should be single integers 
  • Drop relational integrity (RI) / foreign keys - restore after load is complete
  • Drop indexes and re-build after load
  • Partition data leaving data loaded earlier unchanged
  • Load changed data only  - use "delta" processing
  • Avoid SQL Update with logging overhead - possibly drop rows and reload using bulk loader
  • Do a small number of updates with SQL Update, then use bulk load for inserts
  • Use a Cyclic Redundancy Checksum (CRC) to detect changes in data rather than the brute-force method of comparing each column (see the sketch after this list)
  • Divide SQL Updates into groups to avoid creating a large rollback log
  • Use an ETL tool that supports parallelism
  • Use an ETL tool that supports caching
  • Use RAID technologies
  • Use fast disk and controllers - 15,000 RPM
  • Dedicate servers and disk to business intelligence - do not share with other applications 
  • Use multiple servers to support BI such as: a database server, an analysis server and a reporting server
  • Use a server with large main memory (16 GB +) - this increases data caching and reduces physical data access 
  • Use a server with multiple processors / cores to enable greater parallelism
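
To illustrate the CRC-based change detection mentioned above, here is a minimal sketch in Python. The key column and row fields are hypothetical; a real implementation would read rows from the staging and warehouse tables rather than from in-memory lists.

import zlib

def row_checksum(row, columns):
    """Compute a CRC32 checksum over the concatenated column values of a row."""
    payload = "|".join(str(row[col]) for col in columns)
    return zlib.crc32(payload.encode("utf-8"))

def detect_changed_rows(staged_rows, warehouse_checksums, key_col, columns):
    """Return staged rows whose checksum differs from the stored checksum.

    warehouse_checksums maps business key -> previously stored CRC32 value.
    New keys and changed rows are returned; unchanged rows are skipped,
    avoiding a column-by-column comparison.
    """
    changed = []
    for row in staged_rows:
        crc = row_checksum(row, columns)
        if warehouse_checksums.get(row[key_col]) != crc:
            changed.append((row, crc))
    return changed

# Hypothetical usage with in-memory data standing in for staging/warehouse tables.
staged = [
    {"customer_key": 1, "name": "Acme", "status": "active"},
    {"customer_key": 2, "name": "Beta", "status": "closed"},
]
stored = {1: row_checksum({"customer_key": 1, "name": "Acme", "status": "active"},
                          ["name", "status"])}
print(detect_changed_rows(staged, stored, "customer_key", ["name", "status"]))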

Data Mapping for Data Warehousing and Business Intelligence

A Data Map is a specification that identifies data sources and targets as well as the mapping between them. The Data Map specification is created and reviewed with input from business Subject Matter Experts (SMEs) who understand the data.
There are two levels of mapping: entity level and attribute level. Each target entity (table) has a high-level mapping description and is supported by a detailed attribute-level mapping specification. For example, an entity-level map might look like this:
Target Table Name: dw_customer
Target Table Description: High-level information about a customer such as name, customer type and customer status.
Source Table Names: dwprod1.dwstage.crm_cust, dwprod1.dwstage.ord_cust
Join Rules: crm_cust.custid = ord_cust.cust_nbr
Filter Criteria: crm_cust.cust_type not = 7
Additional Logic: N/A
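
As a minimal sketch of how this entity-level map might translate into an extraction query, the following Python snippet simply builds and prints the SQL implied by the map above rather than executing it against a real connection.

# Build the staging extraction query implied by the entity-level data map above.
source_tables = ["dwprod1.dwstage.crm_cust", "dwprod1.dwstage.ord_cust"]
join_rule = "crm_cust.custid = ord_cust.cust_nbr"
filter_criteria = "crm_cust.cust_type <> 7"

extract_sql = (
    "SELECT crm_cust.*, ord_cust.*\n"
    f"FROM {source_tables[0]} crm_cust\n"
    f"JOIN {source_tables[1]} ord_cust ON {join_rule}\n"
    f"WHERE {filter_criteria}"
)
print(extract_sql)  # hand this to the ETL tool or a DB-API cursor for execution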
Then, for each attribute, the attribute-level data map specifies:
  • Source: table name, column name, datatype
  • Target: table name, column name, datatype
  • Transformation Rule
  • Notes
[Figure: Attribute Level Data Map for Data Warehousing]
Transformations may include (a small illustrative sketch follows this list):
  • Aggregate
  • Substring
  • Concatenate
  • Breakout Array Values / Buckets
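
The sketch below shows, in Python, what a few of these transformation rules might look like when applied to a staged row. The field names (first_name, last_name, phone, cust_type) are hypothetical examples, not taken from the data map above.

def transform_customer(row):
    """Apply a few illustrative attribute-level transformation rules."""
    return {
        # Concatenate: build a full name from two source columns.
        "customer_name": f"{row['first_name']} {row['last_name']}".strip(),
        # Substring: keep the area code from a 10-digit phone number.
        "area_code": row["phone"][:3],
        # Breakout of bucket values: map a numeric code to a named bucket.
        "customer_type": {1: "Retail", 2: "Wholesale"}.get(row["cust_type"], "Other"),
    }

print(transform_customer(
    {"first_name": "Jane", "last_name": "Doe", "phone": "3125550100", "cust_type": 2}
))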

DATA WAREHOUSE TESTING IS DIFFERENT

Data warehouse population is done almost entirely through batch runs, so testing it is different from testing transaction systems.

Data warehouse testing differs from typical transaction-system testing on the following counts:

User-Triggered vs. System triggered

Most production/source-system testing exercises individual transactions, which are driven by some input from the users (an application form, a servicing request, and so on). Very few test cycles cover system-triggered scenarios (such as billing or valuation).

In a data warehouse, most of the testing is system-triggered, driven by the ETL (Extraction, Transformation and Loading) scripts, the view refresh scripts, and so on.

Therefore data warehouse testing is typically divided into two parts: 'back-end' testing, where the source-system data is compared to the end-result data in the loaded area, and 'front-end' testing, where the user checks the data by comparing their MIS with the data displayed by end-user tools such as OLAP.

Batch vs. online gratification

This is something which makes it a challenge to retain users' interest.

A transaction system provides instant, or at least overnight, gratification: when users enter a transaction, it is processed online or at most via an overnight batch. In the case of a data warehouse, most of the action happens in the back-end, and users have to trace individual transactions through to the MIS and the views produced by the OLAP tools. This is the same challenge as asking users to test the mammoth month-end reports and financial statements churned out by the transaction systems.

Volume of Test Data

The test data in a transaction system is a very small sample of the overall production data. Typically, to keep matters simple, we include just enough test cases to comprehensively cover all possible test scenarios within a limited set of test data.

A data warehouse typically needs large test data sets, because one tries to cover as many combinations and permutations of dimensions and facts as possible.

For example, if you are testing the location dimension, you would like the location-wise sales revenue report to show revenue figures for most of the 100 cities and the 44 states. That means you need thousands of sales transactions at the sales office level (assuming that the sales office is the lowest level of granularity for the location dimension).

Possible scenarios/ Test Cases

If a transaction system has, say, a hundred different scenarios, the valid and possible combinations of those scenarios are not unlimited. In a data warehouse, however, the permutations and combinations one could test are virtually unlimited, because the core objective of a data warehouse is to allow all possible views of the data. In other words, you can never fully test a data warehouse.

Therefore one has to be creative in designing the test scenarios to gain a high level of confidence.

Test Data Preparation

This is linked to the points about possible test scenarios and volume of data. Since a data warehouse needs a lot of both, the effort required to prepare test data is much greater.

Programming for testing challenge

In transaction systems, users and business analysts typically test the output of the system. In a data warehouse, however, most of the action happens at the back-end, so most of the data quality and ETL testing is done by running separate stand-alone scripts. These scripts compare, say, pre-transformation and post-transformation aggregates and flag the discrepancies. Users' roles come into play when their help is needed to analyze the discrepancies (if the designers or business analysts cannot figure them out).
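
As a minimal sketch of such a stand-alone back-end test, the following Python snippet compares a source-side aggregate with the corresponding aggregate in the loaded target and reports any difference. The table and column names are hypothetical placeholders; an actual script would point at the real staging and warehouse tables and use the appropriate database driver.

import sqlite3  # stands in for any DB-API driver (Oracle, Teradata, etc.)

# Hypothetical table and column names; replace with the real staging and warehouse tables.
SOURCE_QUERY = "SELECT COUNT(*), SUM(order_amount) FROM stage_orders"
TARGET_QUERY = "SELECT COUNT(*), SUM(order_amount) FROM fact_orders"

def reconcile(conn, tolerance=0.01):
    """Compare pre-transformation and post-transformation aggregates."""
    src_count, src_sum = conn.execute(SOURCE_QUERY).fetchone()
    tgt_count, tgt_sum = conn.execute(TARGET_QUERY).fetchone()
    problems = []
    if src_count != tgt_count:
        problems.append(f"row count mismatch: source={src_count} target={tgt_count}")
    if abs((src_sum or 0) - (tgt_sum or 0)) > tolerance:
        problems.append(f"amount mismatch: source={src_sum} target={tgt_sum}")
    return problems

if __name__ == "__main__":
    # Tiny in-memory demo standing in for the staging area and the warehouse.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stage_orders (order_amount REAL)")
    conn.execute("CREATE TABLE fact_orders (order_amount REAL)")
    conn.executemany("INSERT INTO stage_orders VALUES (?)", [(10.0,), (20.0,)])
    conn.executemany("INSERT INTO fact_orders VALUES (?)", [(10.0,)])  # one row lost
    for issue in reconcile(conn):
        print("Discrepancy:", issue)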

Various Types of Testing

    XML Testing
    Network Latency Modeling
    Java Testing (J2EE/EJB)        
    Transaction Characterization
    Data Integrity Testing        
    Load/Scalability Testing
    GUI Testing        
    Performance Testing
    Issue/Defect Tracking        
    Stress Testing
    Requirements Management        
    Configuration Testing
    Interoperability Testing   
     Volume Testing
    Functional Testing        
    Concurrency Testing
    Integration Testing        
    Resource Usage Testing
    Web Site Monitoring        
    Infrastructure Testing
    SLA Testing        
    Component Testing
    Security Testing        
    Failover Testing
    Business Rules Testing        
    Reliability Testing
    COM+ Testing              


Testing Type: Description

XML Testing: Validation of XML data content on a transaction-by-transaction basis. Where desirable, validation of formal XML structure (metadata structure) may also be included.
Java Testing (EJB, J2EE): Direct exercise of class methods to validate that both object properties and methods properly reflect and handle data according to business and functional requirements of the layer. Exercise of transactions at this layer may be performed to measure both functional and performance characteristics.
Data Integrity Testing: Validation of system data at all data capture points in a system, including front-end, middle- or content-tier, and back-end database. Data integrity testing includes strategies to examine and validate data at all critical component boundaries.
GUI Testing: Validation of GUI characteristics against GUI requirements.
Issue/Defect Tracking: Tracking software issues and defects is at the core of the software quality management process. Software quality can be assessed at any point in the development process by tracking the number of defects and defect criticality. Software readiness for deployment can be analyzed by following defect trends for the duration of the project.
Requirements Management: Requirements both define the shape of software (look-and-feel, functionality, business rules) and set a baseline for testing. As such, requirements management, the orderly process of gathering requirements and keeping requirements documentation updated on a release-by-release basis, is critical to the deployment of quality software.
Interoperability Testing: Validation that applications in a given platform configuration do not conflict, causing loss of functionality.
Functional Testing: Validation of business requirements, GUI requirements and data handling in an application.
Security Testing: Validation that the security requirements of a system have been correctly implemented, including resistance to password cracking and Denial of Service (DoS) attacks, and that known security flaws have been properly patched.
Business Rules Testing: Validation that business rules have been properly implemented in a system, enforcing correct business practices on the user.
COM+ Testing: Direct exercise of COM methods to validate that both object properties and methods properly reflect and handle data according to business and functional requirements of the COM layer. Exercise of transactions at this layer may be performed to measure both functional and performance characteristics.
Integration Testing: Testing in which software components, hardware components, or both are combined and tested to evaluate the interaction between them.
Network Latency Modeling: Analysis of the fundamental amount of time it takes a given message to traverse a given distance across a specific network. This factor influences all messages that traverse a network and is key in modeling network behavior.
Transaction Characterization: Determining the footprint of business transactions, including bandwidth on the network and CPU and memory utilization on back-end systems. Also used in Network Latency Modeling and Resource Usage Testing.
Load/Scalability Testing: Increasing load on the target environment until requirements are exceeded or a resource is saturated. This is usually combined with other test types to optimize performance.
Performance Testing: Determining whether the test environment meets requirements at set loads and mixes of transactions by testing specific business scenarios.
Stress Testing: Exercising the target system or environment at the point of saturation (depletion of a resource: CPU, memory, etc.) to determine whether the behavior changes and possibly becomes detrimental to the system, application or data.
Configuration Testing: Testing various system configurations to assess the requirements and resources needed.
Volume Testing: Determining the volume of transactions that a complete system can process. Volume Testing is conducted in conjunction with Component, Configuration and/or Stress Testing.
Resource Usage Testing: Multi-user testing conducted beyond Transaction Characterization to determine the total resource usage of applications and subsystems or modules.
Concurrency Testing: Multi-user testing geared towards determining the effects of accessing the same application code, module or database records. Identifies and measures the level of locking, deadlocking and use of single-threaded code and locking semaphores.
Infrastructure Testing: Verifying and quantifying the flow of data through the environment infrastructure.
Component Testing: The appropriate tests are conducted against each component individually to verify that it can support its function without failure. This testing is typically conducted while the environment is being assembled to identify any weak links.
Failover Testing: In environments that employ redundancy and load balancing, Failover Testing analyzes the theoretical failover procedure, and tests and measures the overall failover process and its effects on the end user.
Reliability Testing: Once the environment or application is working and optimized for performance, a longer (24- to 48-hour) Reliability Test determines whether there are any long-term detrimental issues that may affect performance in production.
SLA Testing: Specialized business transaction testing to measure Service Level Agreements with third-party vendors. The typical agreement guarantees a specified volume of activity over a predetermined time period with a specified maximum response time.
Web Site Monitoring: Monitoring business transaction response times after production deployment to ensure end-user satisfaction.
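
To make the Load/Scalability and Performance Testing entries concrete, here is a minimal sketch in Python of a step-up load test: it raises the number of concurrent simulated users until the observed response time exceeds a requirement. The run_transaction function and the response-time target are hypothetical stand-ins for a real business transaction (for example, an OLAP query or report request) and its SLA.

import time
from concurrent.futures import ThreadPoolExecutor

RESPONSE_TIME_REQUIREMENT = 0.5  # seconds; hypothetical requirement for one transaction

def run_transaction():
    """Hypothetical business transaction; replace with a real query or report call."""
    time.sleep(0.05)  # simulate work

def measure(concurrent_users, transactions_per_user=10):
    """Run the transaction under a given concurrency level and return the average latency."""
    latencies = []

    def worker():
        for _ in range(transactions_per_user):
            start = time.perf_counter()
            run_transaction()
            latencies.append(time.perf_counter() - start)

    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        for _ in range(concurrent_users):
            pool.submit(worker)
    return sum(latencies) / len(latencies)

if __name__ == "__main__":
    users = 1
    while True:
        avg = measure(users)
        print(f"{users} users: average response {avg:.3f}s")
        if avg > RESPONSE_TIME_REQUIREMENT or users >= 64:
            break
        users *= 2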
 


Testing the Data Warehouse

Testing the data warehouse and business intelligence system is critical to success.  Without testing, the data warehouse could produce incorrect answers and quickly lose the faith of the business intelligence users. Effective testing requires putting together the right processes, people and technology and deploying them in productive ways.

Data Warehouse Testing Responsibilities

Who should be involved with testing?  The right team is essential to success:

Business Analysts gather and document requirements
QA Testers develop and execute test plans and test scripts
Infrastructure people set up test environments
Developers perform unit tests of their deliverables
DBAs test for performance and stress
Business Users perform functional tests including User Acceptance Tests (UAT)
Business Requirements and Testing

When should your project begin to think about testing?  The answer is simple - at the beginning of the project.  Successful testing begins with the gathering and documentation of requirements.  Without requirements there is no measure of system correctness.

Expect to produce a Requirements Traceability Matrix (RTM) that cross-references data warehouse and business intelligence features to business requirements.  The RTM is a primary input to the Test Plan.
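
A minimal sketch of what an RTM might look like in practice, kept here as a simple Python structure with a check for requirements that no test case covers. The requirement IDs, feature names and test case IDs are hypothetical.

# Hypothetical Requirements Traceability Matrix: requirement -> feature and test cases.
rtm = [
    {"req": "BR-001", "feature": "Customer dimension load", "tests": ["TC-101", "TC-102"]},
    {"req": "BR-002", "feature": "Monthly sales aggregate",  "tests": ["TC-201"]},
    {"req": "BR-003", "feature": "Region drill-down report", "tests": []},
]

def untested_requirements(matrix):
    """Return requirement IDs with no test case assigned, a gap the Test Plan must close."""
    return [entry["req"] for entry in matrix if not entry["tests"]]

print("Requirements with no test coverage:", untested_requirements(rtm))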

Data Warehousing Test Plan

The Test Plan, typically prepared by the QA Testers, describes the tests that must be performed to validate the data warehousing and business intelligence system.  It describes the types of tests and the coverage of required system features.

Test Cases are the details that enable implementation of the Test Plan.  A Test Case itemizes the steps that must be taken to test the system, along with the expected results.  A Test Execution Log tracks each test along with the result (pass or fail) of each test item.

Testing Environments and Infrastructure

Multiple environments must typically be created and maintained to support the system during its lifecycle:

Development
QA
Staging / Performance
Production
These kinds of tools can facilitate testing and problem correction:

Automated test tool
Test data generator
Test data masker
Defect manager
Automated test scripts
Unit Testing for the Data Warehouse

Developers perform tests on their deliverables during and after the development process.  A unit test is performed on an individual component and is based on the developer's knowledge of what should be developed.

Unit testing should definitely be performed before deliverables are turned over to QA by developers.  Tested components are likely to have fewer bugs.
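
As a minimal sketch of a developer unit test, the following uses Python's built-in unittest module to check a small, hypothetical transformation function of the kind an ETL job might apply; the function and expected values are illustrative only.

import unittest

def derive_customer_status(last_order_days_ago):
    """Hypothetical ETL transformation rule: classify a customer by order recency."""
    if last_order_days_ago is None:
        return "UNKNOWN"
    return "ACTIVE" if last_order_days_ago <= 365 else "LAPSED"

class TestDeriveCustomerStatus(unittest.TestCase):
    def test_recent_customer_is_active(self):
        self.assertEqual(derive_customer_status(30), "ACTIVE")

    def test_old_customer_is_lapsed(self):
        self.assertEqual(derive_customer_status(400), "LAPSED")

    def test_missing_value_is_unknown(self):
        self.assertEqual(derive_customer_status(None), "UNKNOWN")

if __name__ == "__main__":
    unittest.main()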

QA Testers Perform Many Types of Tests

QA Testers design and execute a number of tests:

Integration Test   
Test the system's operation from beginning to end, focusing on how data flows through the system.  This is sometimes called "system testing" or "end-to-end testing".

Regression Test    Validate that the system continues to function correctly after being changed, so that changes do not "break" the system.


Can the Data Warehouse Perform?

Tests can be designed and executed that show how well the system performs with heavy loads of data:

Extract Performance Test

Test the performance of the system when extracting a large amount of data.

Transform and Load Performance Test   
Test the performance of the system when transforming and loading a large amount of data.  Testing with a high volume is sometimes called a "stress test".

Analytics Performance Test    Test the performance of the system when manipulating the data through calculations.
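
A minimal sketch of how a transform-and-load performance test might be harnessed: generate a large synthetic batch, time the transform-and-load step, and compare throughput against a target. The transform_and_load function and the throughput target are hypothetical placeholders for the real ETL job and its requirement.

import time

ROWS = 1_000_000                      # size of the synthetic test batch
TARGET_ROWS_PER_SECOND = 50_000       # hypothetical performance requirement

def transform_and_load(rows):
    """Placeholder for the real transform-and-load job under test."""
    return [(r, r * 1.1) for r in rows]   # trivial stand-in transformation

if __name__ == "__main__":
    batch = range(ROWS)               # synthetic data standing in for an extract
    start = time.perf_counter()
    transform_and_load(batch)
    elapsed = time.perf_counter() - start
    throughput = ROWS / elapsed
    print(f"Processed {ROWS} rows in {elapsed:.1f}s ({throughput:,.0f} rows/s)")
    print("PASS" if throughput >= TARGET_ROWS_PER_SECOND else "FAIL")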

Business Users Test Business Intelligence

Does the system produce the results desired by business users?  The main concern is functionality, so business users perform functional tests to make sure that the system meets business requirements.  The testing is performed through the user interface (UI) which includes data exploration and reporting.

Correctness Test   
The system must produce correct results.  The measures and supporting context need to match the numbers in other systems and be calculated correctly.

Usability Test    The system should be as easy to use as possible.  Usability testing involves a controlled experiment in which business users try to use the business intelligence system to reach stated goals.
Performance Test   
The system must be able to return results quickly without bogging down other resources.



Business Intelligence Must Be Believed

Quality must be baked into the data warehouse or users will quickly lose faith in the business intelligence produced.  It then becomes very difficult to get people back on board.

Putting the quality in requires both the testing described in this article and data quality at the source described in the article, Data Sources for Data Warehousing, to launch a successful data warehousing / business intelligence effort.

FUNDAMENTALS OF DATA WAREHOUSE TESTING

Description
This course introduces the student to the phases of testing and validation in a data warehouse or other decision support systems project.  Students will learn the role of the testing process as part of a software development project, see how business requirements become the foundation for test cases and test plans, develop a testing strategy, develop audience profiles, and learn how to develop and execute effective tests, all as part of a data warehouse / decision support initiative.  Students will be able to apply the data warehouse concepts in a series of related exercises that enable them to create and refine the various artifacts of testing for their data warehouse programs.

What Makes This Certified Course Unique
This ICCP-certified course provides participants with practical, in-depth understanding of how to create accurate data models for complex Business Intelligence solutions. Hands-on workshops throughout the course will reinforce the learning experience and provide the attendees with concrete results that can be utilized in their organizations.


Course Objectives:
Upon completion of this course, students will be able to:

Review the fundamental concepts of data warehousing and its place in an information management environment
Learn about the role of the testing process as part of software development and as part of data warehouse development
Learn about test strategies, test plans and test cases – what they are and how to develop them, specifically for data warehouses and decision support systems
Create effective test cases and scenarios based on business and user requirements for the data warehouse
Plan and coordinate usability testing for data warehousing
Conduct reviews and inspections for validation and verification
Participate in the change management process and document relevant changes to decision support requirements
Prerequisites:

Experience as a test analyst, business analyst or experience in the testing process
Audience:

Testing analysts, business analysts, project managers, business staff members who will participate in the testing function; data warehouse architects, data analysts
Course Topics:
Understanding Business Intelligence

Analyze the current state of the data warehousing industry
Data warehousing fundamentals
Operational data store (ODS) concepts
Data mart fundamentals
Defining meta data and its critical role in data warehousing and testing
Key Principles in Testing

Introduction
Testing concepts
Overview of the testing and quality assurance phases
Project Management Overview

Basic project management concepts
Project management in software development and data warehousing
Testing and quality assurance as part of software project management
Requirements Definition for Data Warehouses

Requirements management workflow
Characteristics of good requirements for decision support systems
Requirements-based testing concepts and techniques
Audiences in Testing

Audiences and their profiles
User profiles
Customer profiles
Functional profiles
Testing strategies by audience
Test management overview
Risk Analysis and Testing

Risk analysis overview for testing
Test Methods and Testing Levels

Static vs. dynamic tests
Black, grey and white box testing
Prioritizing testing activities
Testing from unit to user acceptance
Test Plans and Procedures

Writing and managing test plans and procedures
Test plan structure and test design specifications
Test Cases Overview

Test case components
Designing test scenarios for data warehouse usage
Creating and executing test cases from scenarios
Validation and Verification

Validating customer needs for decision support
Tools and techniques for validation, verification and assessment
Acceptance Testing for Data Warehouses

Ways to capture informal and formal user issues and concerns
Test readiness review
Iterative testing for data warehouse projects
Reviews and Walk-throughs

Reviews versus walkthroughs
Inspections in testing and quality assurance
Testing Traceability

Linking tests to requirements with a traceability matrix
Change management in decision support systems and testing
To learn more about how EWSolutions can provide this training for your company or to request a quote, please contact David Marco, Director of Education, at DMarco@EWSolutions.com or call 630.920.0005 ext. 103.

Test Execution and Documentation

Managing the testing and quality assurance process
Documentation for the testing process
Conclusion
Summary, advanced exercises, resources for further study

SOURCE:http://www.ewsolutions.com/education/data-warehouse-training/document.2008-09-28.9223227179

Monday, April 25, 2011

Data Warehousing ETL tutorial


The ETL and Data Warehousing tutorial is organized into lessons representing various business intelligence scenarios, each of which describes a typical data warehousing challenge.
This guide can be considered an ETL process and Data Warehousing knowledge base, with a series of examples illustrating how to manage and implement the ETL process in a data warehouse environment.

The purpose of this tutorial is to outline and analyze the most widely encountered real-life data warehousing problems and challenges that need to be taken into account during the design and architecture phases of a successful data warehouse project deployment.

Going through the sample implementations of the business scenarios is also a good way to compare Business Intelligence and ETL tools and get to know the different approaches to designing the data integration process. It also helps identify the strong and weak points of the various ETL and data warehousing applications.

This tutorial shows how to use the following BI, ETL and datawarehousing tools: Datastage, SAS, Pentaho, Cognos and Teradata.
Data Warehousing & ETL Tutorial lessons

    Surrogate key generation example, which includes information on business keys and surrogate keys and shows how to design an ETL process to manage surrogate keys in a data warehouse environment (a minimal sketch follows this list). Sample design in Pentaho Data Integration
    Header and trailer processing - considerations on processing files arranged in blocks consisting of a header record, body items and a trailer. These files usually come from mainframes; the pattern also applies to EDI and EPIC files. Solution examples in Datastage, SAS and Pentaho Data Integration
    Loading customers - a data extract is placed on an FTP server. It is copied to an ETL server and loaded into the data warehouse. Sample loading in Teradata MultiLoad
    Data allocation ETL process case study for allocating data. Examples in Pentaho Data Integration and Cognos PowerPlay
    Data masking and scrambling algorithms and ETL deployments. Sample Kettle implementation
    Site traffic analysis - a guide to creating a data warehouse with data marts for website traffic analysis and reporting. Sample design in Pentaho Kettle
    Data Quality - ETL process design aimed to test and cleanse data in a Data Warehouse. Sample outline in PDI
    XML ETL processing
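
As a minimal sketch of the surrogate key handling described in the first lesson, the following Python snippet looks up the existing surrogate key for a business (natural) key and assigns the next sequence value when the key is new. The in-memory dictionary stands in for a key-lookup table in the warehouse.

# In-memory stand-in for a warehouse key-lookup table: business key -> surrogate key.
key_map = {"CUST-1001": 1, "CUST-1002": 2}
next_key = max(key_map.values()) + 1

def surrogate_key(business_key):
    """Return the surrogate key for a business key, creating one if it is new."""
    global next_key
    if business_key not in key_map:
        key_map[business_key] = next_key
        next_key += 1
    return key_map[business_key]

for bk in ["CUST-1002", "CUST-1003", "CUST-1002"]:
    print(bk, "->", surrogate_key(bk))
# CUST-1002 keeps its existing key; CUST-1003 gets a newly generated one.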

What is Business Intelligence?


Business intelligence is a broad set of applications, technologies and knowledge for gathering and analyzing data for the purpose of helping users make better business decisions.
The main challenge of Business Intelligence is to gather and serve organized information on all relevant factors that drive the business, and to let end-users access that knowledge easily and efficiently, in effect maximizing the success of the organization.
Business intelligence produces analysis and provides in-depth knowledge about performance indicators such as a company's customers, competitors, business counterparts, economic environment and internal operations, to help make effective, good-quality business decisions.

From a technical standpoint, the most important areas that Business Intelligence (BI) covers are:
DW - Data warehousing - architecture, modeling, managing, processing
ETL process and data integration
Reporting, Information visualization and Dashboards
OLAP - Online Analytical Processing and multidimensional analysis
Data cleansing and data quality management
Performance management
Data mining, statistical analysis, forecasting
MIS - Management Information Systems
CRM - Customer Relationship Management

ETL-Tools.Info portal

The ETL-Tools.Info portal provides information about different business intelligence tools and data warehousing solutions, with a main focus on the ETL process and ETL tools. On our pages you will find both general articles with high-level information on various Business Intelligence applications and architectures, and technical documents with low-level descriptions of the presented solutions and detailed tutorials.
Great attention is paid to the Datastage ETL tool, and we provide a number of Datastage examples, Datastage tutorials, best practices and resolved problems with real-life examples.
There is also a wide range of information on the rapidly growing Open Source Business Intelligence (OSBI) market, with emphasis on applications from the Pentaho BI family, including a Pentaho tutorial.
We also provide a SAS Guide with tutorial, which illustrates the vision of SAS on Business Intelligence, Data Warehousing and ETL process.
We have recently added the ETL case study (ETL and data warehousing course) section, which presents a set of business cases, each of which illustrates a typical data warehousing problem followed by sample implementations. We analyze the cases thoroughly and propose the most efficient and appropriate approach to solving those problems by showing sample ETL process designs and DW architectures.
Microsoft users may be very interested in exploring our Excel BI crosstabs section with FAQ and sample solutions.


Buy vs. Build



When it comes to ETL tool selection, it is not always necessary to purchase a third-party tool. This determination largely depends on three things:

    Complexity of the data transformation: The more complex the data transformation is, the more suitable it is to purchase an ETL tool.
    Data cleansing needs: Does the data need to go through a thorough cleansing exercise before it is suitable to be stored in the data warehouse? If so, it is best to purchase a tool with strong data cleansing functionalities. Otherwise, it may be sufficient to simply build the ETL routine from scratch.
    Data volume: Available commercial tools typically have features that can speed up data movement. Therefore, buying a commercial product is a better approach if the volume of data transferred is large.

ETL Tool Functionalities

While the selection of a database and a hardware platform is a must, the selection of an ETL tool is highly recommended but not required. When you evaluate ETL tools, it pays to look for the following characteristics:

Functional capability: This includes both the 'transformation' piece and the 'cleansing' piece. In general, the typical ETL tools are either geared towards having strong transformation capabilities or having strong cleansing capabilities, but they are seldom very strong in both. As a result, if you know your data is going to be dirty coming in, make sure your ETL tool has strong cleansing capabilities. If you know there are going to be a lot of different data transformations, it then makes sense to pick a tool that is strong in transformation.

Ability to read directly from your data source: For each organization, there is a different set of data sources. Make sure the ETL tool you select can connect directly to your source data.

Metadata support: The ETL tool plays a key role in your metadata because it maps the source data to the destination, which is an important piece of the metadata. In fact, some organizations have come to rely on the documentation of their ETL tool as their metadata source. As a result, it is very important to select an ETL tool that works with your overall metadata strategy.

Popular Tools

    IBM WebSphere Information Integration (Ascential DataStage)
    Ab Initio
    Informatica
    Talend

Monday, April 18, 2011

Information Security Services

Information security is a vital business enabler for all corporate organizations. The challenges faced in protecting critical assets like data and information call for a solid, robust security framework that can detect, prevent and block malicious attempts to steal or misuse these vital resources.

Telkite's Information Security services, aimed at helping companies achieve continuous real-time network monitoring and proactive security intelligence to ensure the safety, availability and reliability of information, can be classified into the following broad categories.


Professional Consulting
Security Governance and Compliance
Telkite's umbrella services in Security Governance, Risk and Compliance help organizations evaluate operational controls and ensure that controls and processes achieve governance and compliance with the required norms.

Network and System Security
Telkite has a comprehensive suite of services for network and system security assessment that ensure the network devices, servers and systems processing information are configured to stringent protocols that eliminate security risks.

Identity and Access Management
IAM (Identity and Access Management) protects against unauthorized access to vital information by implementing best-of-breed access management systems that ensure only authorized people get access, based on roles and needs.

Information Technology and Systems Audit
Information Technology and Systems Audit helps organizations highlight business risks due to inadequacy of controls for live systems and the infrastructure supporting them. Telkite's auditing services are classified into Application Audits and Technical Audits.

Application Audits
Application audits subject applications in production to verification of controls from the business process perspective; these may be administrative controls or controls embedded in the application. Auditors evaluate the complete logical security related to authentication, authorization and the surrounding configuration.

Technical Audits
Technical audits identify vulnerabilities with respect to operating systems, network devices and public websites, and all IT operational controls are also assessed from the control risk perspective. Technical Audits cover areas such as Data Centre, BCP & Testing, Risk Assessment, Network Architecture and Components, Email Services, Active Directory Services, IT General Controls and Information Security Policy Compliance.

Business Continuity and Disaster Recovery
Organizations rely heavily on infrastructure for information processing, and it is vital for infrastructure systems to have continuous uptime and be robust enough to handle contingencies. Telkite assesses organizations from the business continuity perspective and recommends the recovery strategy and a plan of action for implementation.

Application Security
Telkite helps organizations assess application security requirements with respect to business and development processes. Application audits also form part of Application Security.

Managed Services
Vulnerability Management
Vulnerability Management services identify vulnerabilities in critical servers and network devices and recommend best practices for handling them. Penetration testing is done on servers and network components exposed to the internet, as well as on wireless access points.

Log management & analysis
Log management and analysis involves analysis of security logs and action on exceptions.

Manage security operations
Security operation services include antivirus management, patch management, tool maintenance and license management.

Security Device Management
Security device management involves managing the network perimeter, intranet, extranet, servers, desktops and laptops.

SOURCE:http://www.telkite.com/informservice1.html

Middleware & SOA Services

Telkite SOA Vision
Telkite aims to seamlessly integrate business processes and the underlying applications to facilitate real-time information sharing and create a nimble enterprise that successfully blends IT reuse and agility. This vision entails a five-stage process, beginning with the implementation and development of applications to automate homogeneous business processes. Stages 1 and 2 are about integration of heterogeneous business processes using SOA and EAI, while stage 3 is the creation of self-improving and continuously optimizing business processes. Stage 4 entails the creation of real-time information sharing mechanisms using business intelligence and business activity monitoring, and finally stage 5 is using Web 2.0 to build enterprise solutions.


Telkite Services
The Middleware & SOA practice offers business process integration solutions to large enterprises that face typical integration and reuse challenges in key operational business processes.

At Telkite we have provided services to customers on their journey from mass production to mass customization. The services are classified into five key areas of integration, namely EDI, EAI, BPM, IDM and SOA-driven integration. Delivery of these services is aided by the practice's focus on business process integration.

Services include:
EAI Services
EAI Product Evaluation
Integration Design
Implementation
Migration and Upgrades
Support and Maintenance
Testing

BPM Services
Platform Implementation
Process Modelling
Process Optimization
Support and Maintenance
Testing

IDM Services
Single Sign On
Access Management
Provisioning, User life Cycle Management
Personalization
Workflow
Integration with BPM and Middleware

SOA Services
SOA Roadmap and Consulting
ESB Implementation
SOA Testing
Service Creation
Service Extraction
Service Utilization and Management

SOA – Aligning IT with Business