Wednesday, March 7, 2012

Data Warehousing Solutions from SAP

With SAP NetWeaver Business Warehouse (SAP NetWeaver BW), you can tightly integrate data warehousing capabilities on a comprehensive and scalable platform, while leveraging best practices to drive business intelligence predicated on a single version of the truth.
An edition of SAP NetWeaver BW powered by SAP HANA is also available to help you supercharge your enterprise data warehouse, simplify your IT landscape, and enable your business users to make decisions faster.


By combining a scalable and layered architecture, a rich set of predefined business content based on established best practices, and key enterprise information management capabilities, SAP NetWeaver BW can help you achieve:
  • Reliable data acquisition – Tightly integrate data across all applications and business processes in SAP Business Suite, and improve both access to heterogeneous data sources and data quality.
  • Business-oriented modeling – Enable quick and sustainable implementations through modeling patterns based on established best practices and rich predefined business content that match your business processes. Deploy all models across different subject domains and enable a single version of the truth across your complete data warehouse.
  • Robust analytical capabilities – Support online analytical processing and provide a robust foundation for computing multidimensional business data across dimensions and hierarchies. Benefit from a framework for building planning applications tightly integrated with your enterprise data warehouse.
  • Enterprise-ready life-cycle management – Benefit from sophisticated life-cycle management functionality at three different levels – system life-cycle management, metadata life-cycle management, and information life-cycle management.

SAP NetWeaver Business Warehouse can help you reap significant benefits:
  • Rapid implementation – SAP NetWeaver BW ships with business content, such as ready-made ETL routines, metadata, InfoCubes, information models, and reports to support reporting and analysis right out of the box.
  • Robust scalability and performance – Ensure faster decision making with accelerated data loads and rapid delivery of queries using SAP NetWeaver BW Accelerator.
  • Efficient development and reduced TCO – Automated data flow design and graphic data modeling tools speed development, enabling you to reduce the total cost of ownership for managing business data.
  • Streamlined operations – Manage and monitor operations with functionality to actively push critical events – and actionable recommendations for recovery and self-healing – to administrators. Ensure compliance with corporate policies, while ensuring high data manageability and consistency.


Informatica Strikes a Big Data Partnership

Informatica this week inscribed another notch in its Big Data belt by inking a partnership agreement with MapR, one of the leading Hadoop distributions in the marketplace. The partnership further opens Hadoop to the sizable market of Informatica developers and provides a visual development environment for creating and running MapReduce jobs.
The partnership is fairly standard by Hadoop terms. Informatica can connect to MapR via PowerExchange and apply PowerCenter functions to the extracted data, such as data quality rules, profiling functions, and transformations. Informatica also provides HParser, a visual development environment for parsing and transforming Hadoop data, such as logs, call detail records, and JSON documents. Informatica has already signed similar agreements with Cloudera and Hortonworks.
Deeper Integration. But Informatica and MapR have gone two steps beyond the norm. Because MapR's unique architecture puts an alternate file system (accessible via NFS) behind industry-standard Hadoop interfaces, Informatica has integrated two additional products with MapR: Ultra Messaging and Fast Clone. Ultra Messaging enables Informatica customers to stream data into MapR, while Fast Clone enables them to replicate data in bulk. In addition, MapR will bundle the community edition of Informatica's HParser, the first Hadoop distribution to do so.
The upshot is that Informatica developers can now leverage a good portion of Informatica's data integration platform with MapR's distribution of Hadoop. Informatica is expected to announce the integration of additional Informatica products with MapR later this spring.
The two companies are currently certifying the integration work, which should be finalized by the end of Q1 2012.

COMPARATIVE ARCHITECTURES SEMINAR

For years there have been two leading approaches to building the data warehouse environment – the DW 2.0 “hub and spoke” architecture and the dimensional model/bus architecture. Both architectures have their advantages and disadvantages, and there are places where each approach fits best.
Come hear Bill Inmon discuss the DW 2.0 approach and hear Scott Hirst discuss the dimensional modeling/bus architecture approach.
Then, hear how the two architectures can be blended, yielding a “best of all worlds” approach.
Bill Inmon is well known for articulating the hub and spoke, top-down approach to building the data warehouse. Scott Hirst is an experienced practitioner who has employed the dimensional modeling/bus architecture approach in building numerous data warehouses. Equal time will be given to both approaches. Finally, an architecture that blends the best of DW 2.0 with the best of dimensional modeling will be discussed.
Here are some topics that will be covered in the seminar:
  • What is DW 2.0? Dimensional modeling/bus architecture?
  • Dimensional modeling and building reports quickly
  • DW 2.0 and the lifecycle of data within the data warehouse
  • Dimensional modeling – a case study
  • DW 2.0 – building the unstructured data warehouse

HOW THIS SEMINAR IS DIFFERENT

One-On-One Sessions
Our speakers are offering free, one-on-one personal sessions where you can sit and discuss your architecture, database design approach, your issues, your challenges. When our speakers are not presenting, they will be available for private one-on-one sessions of approximately one hour each. No other seminar offers you the possibility of having private sessions with the industry thought leaders. Take advantage of it.

Colorado Skiing
Skiing in the Rocky Mountains is an unforgettable experience. With ski resorts open into May and sometimes June, we will surely have great skiing and snowboarding weather in early April and most likely plenty of brilliant sunshine. Denver averages 300+ days of sunshine a year. Some people play golf in Denver, then go skiing in the mountains. Where else can you do that in one day?

TDWI's Best of Business Intelligence Volume 9

Welcome to the ninth annual TDWI’s Best of Business Intelligence: A Year in Review. Each year we select a few of TDWI’s best, most well-received, hard-hitting articles, research, and information, and present them to you in this publication.
Stephen Swoyer kicks off this issue with a review of major business intelligence (BI) developments. In “2011 in Review: From Tablets to Takeovers,” he names social BI and the success of tablets and mobile devices as some of 2011’s trends. Swoyer also calls 2011 “the year in which social media emerged as one of several forces … that will fundamentally transform BI as we know it.”
In “2012 Forecast: The Evolution of Big Data Analytics and the Future of BI,” TDWI Research analysts Philip Russom and David Stodder share their predictions for the coming year. Russom offers insight on how the evolution of big data analytics will impact business intelligence and data warehousing professionals, and Stodder shares five trends he sees shaping the future of BI.
Other selections in this year's issue of the Best of Business Intelligence include:
  • Excerpts from the past year’s Best Practices Reports
  • Ten Mistakes to Avoid When Setting Your Cloud Business Intelligence Strategy
  • Articles from TDWI FlashPoint, BI This Week, and TDWI Experts about BI life cycle management, agile requirements gathering, mobile BI, and working with data analysts
  • Two Business Intelligence Journal articles: “How Gamification Will Change Business Intelligence” and “BI Experts’ Perspective: Integrating Structured and Unstructured Data”

Repository query to fetch User groups and users in it.

Can somebody give me a PowerCenter repository query (Oracle/SQL Server) to fetch the below?
1. Fetch a specific user group and its members. E.g.: group Grp1 has members usr1, usr2, usr3, usr4, and usr5.
2. Fetch a specific relational connection name and who has what kind of access. E.g.: users usr1, usr2, and usr3 have R/W/X permission on Rel_Conn1, or groups grp1, grp2, and grp3 have R/X permission on Rel_Conn1.

I tried OPB_USER_GROUP table, but couldn't find relevant information.



hope this works



--Find the group id
select * from OPB_GROUPS;

--Find the users in the group
select * from OPB_USERS
where user_id in (select user_id from OPB_USER_GROUPS where GROUP_ID = <GRP_ID>);

--If the repository tables live in another schema, prefix them with it,
--e.g. select * from INFAPCREPO8.OPB_GROUPS;
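
For the second question (connection permissions), nothing was posted here, but a starting point might look like the sketch below. The table and column names (OPB_CNX for connection objects, OPB_OBJECT_ACCESS for permission grants) and the meaning of the permission bits are assumptions that vary by repository version, so verify them against your own schema before relying on the output. Group-level grants, if your version stores them in the same table, could be fetched similarly by joining OPB_GROUPS instead of OPB_USERS.

--Hypothetical sketch for question 2: who can access a given relational connection.
--Assumes OPB_CNX stores connection objects and OPB_OBJECT_ACCESS stores grants;
--verify table and column names against your repository version.
select c.OBJECT_NAME as connection_name,
       u.USER_NAME   as user_name,
       a.PERMISSIONS as permission_bits  --typically a bitmask covering R/W/X
from OPB_CNX c
join OPB_OBJECT_ACCESS a on a.OBJECT_ID = c.OBJECT_ID
join OPB_USERS u on u.USER_ID = a.USER_ID
where c.OBJECT_NAME = 'Rel_Conn1';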

Tuesday, March 6, 2012

FEATURED: Informatica 9 - How does it change the dynamics of IT Projects?

In my opinion, with the launch of Informatica 9, it looks like the role of developers is reduced considerably, as business users themselves are able to customize data objects to their needs. The development effort is also coming down, as the Logical Data Objects created can be deployed across any service, such as a Virtual SQL Federator Service or a web service. Flexibility is fine, but what is the cost involved in training business users to customize the delivered objects? What is the change brought about in Informatica's flagship product, PowerCenter? Is version 9 mainly focused on the requirements of the "Executive Dashboard"? If so, does it have any impact on other kinds of projects, like data migration, whose focus is not related to any reporting? Do you really think Informatica 9 can reduce IT project costs by reducing effort? Finally, what do you think about Informatica 9?


---
This discussion is part of Toolbox for IT's new Featured Discussion program, which offers open-ended discussion group conversations that enable participants to engage around issues that are slightly broader than traditional group message topics. If you're interested in submitting a topic for a future Featured Discussion, or if you would like to lead a future Featured Discussion, please contact moderator at ittoolbox dot com.


This product is still in beta and many of Informatica's clients haven't used this version yet. Having said that, my observations on version 9 are as follows.

Business/IT Collaboration:

Requirement - design - development can be done through the Informatica 9 suite, which contains Informatica Analyst and Developer.
Informatica Analyst would replace the existing Word/PDF documents, Excel spreadsheets, and customized tools that are widely used by our business users.

Even though the launch presentation showed how easy the tool is, in reality most business users will hesitate to move toward Informatica Analyst.

Consider the current scenario: the business has a requirement; BAs analyze it and create specifications; developers coordinate with the business analysts, develop the application, and roll it out. We have three different groups involved in this process, each with unique roles and responsibilities. Apart from these people, we have infrastructure folks who help us achieve the requirement.

In the version 9 scenario, business users can create their own requirements using the Informatica Analyst tool, and developers can build against those using the Developer tool. This would eliminate the business analyst role. That will be impossible to achieve, because not all business users can act as business analysts.

If you launch this product in a financial institution, most employees will resist, because it would encourage the employer to lay off most of their BAs.

Pervasive Data Quality:

In reality, we can't achieve 100% data quality. The reason is very simple: too many cooks spoil the broth. Especially for a data warehouse, where you receive data from many internal and external systems, it's very difficult to achieve 100% quality because there are too many people, processes, and applications involved.

SOA-Based Data Services:

This is a good feature whereby we can publish our data as a web service.

I don't know about the impact of this release on PowerCenter. The version 9 launch does not speak about existing Informatica products.

I think you are reading too much into it. Since the beginning of time (or at least since PCs became prevalent in the business environment), vendors have cooked up tools that 'empower the business and circumvent traditional IT approaches'. There is nothing wrong with that; in fact, I'm all for it.

However, the reality is, these tools are only as easy and productive as the technical environment behind them. The information environment in any business is very complex and requires architecture, planning, design, and delivery to make sense of it. This requires strong technical skills as well as an understanding of the business.

Business people don't want to be technicians... they don't want to be spending their time buried in a tool defining and testing rules. What something like INFA 9 does is blur the boundaries between IT and the business. This is good, as there shouldn't have been boundaries in the first place (that is why there are so many screwed up systems out there).

What it does mean is that IT and the business need to, and should, work closer together. One is not replacing the other.

New Features of Informatica 9.0

Following are some new features introduced in Informatica 9.0:

1.  New Client tools:

Informatica 9 includes the Informatica Developer and Informatica Analyst client tools.


The Informatica Developer tool is Eclipse-based and supports both data integration and data quality for enhanced productivity. The Informatica Analyst tool is a browser-based tool for analysts, stewards, and line-of-business managers. This tool supports data profiling, specifying and validating rules (scorecards), and monitoring data quality.


2. Informatica Administrator:

The PowerCenter Administration Console has been renamed the Informatica Administrator.


The Informatica Administrator is now a core service in the Informatica Domain that is used to configure and manage all Informatica Services, Security and other domain objects (such as connections) used by the new services.
The number of objects stored in the domain has increased to accommodate these new requirements.

3.  Session Log size:

You can limit the size of session logs for real-time sessions. You can limit the size by time or by file size. You can also limit the number of log files for a session.

4. Lookup Transformation:

  > Cache updates


     You can update the lookup cache based on the results of an expression. When an expression is true, you can add to or update the lookup cache. You can update the dynamic lookup cache with the results of an expression.

  > Database deadlock resilience
      
      In previous releases, when the Integration Service encountered a database deadlock during a lookup, the session failed. Effective in 9.0, the session will not fail. When a deadlock occurs, the Integration Service attempts to run the last statement in a lookup. You can configure the number of retry attempts and time period between attempts.

  > Multiple rows return.

     You can configure the Lookup transformation to return all rows that match a lookup condition. A Lookup transformation is an active transformation when it can return more than one row for any given input row.


 > SQL overrides for uncached lookups

   In previous versions you could create a SQL override for cached lookups only. You can now create an SQL override for uncached lookups. You can include lookup ports in the SQL query.

Breaking News—Informatica Launches Version 9 of its Data Integration Platform

Informatica today announced version 9 of its data integration platform with the theme of "Enabling the Data-Driven Enterprise." The Informatica platform is a comprehensive offering that provides for enterprise data integration, cloud data integration, B2B data exchange, and data quality across the whole enterprise. Major new features in version 9 of the platform focus on the areas of SOA-based data services, pervasive data quality, and Business-IT collaboration.
Arvind Parthasarathi, vice president of product management for Informatica, tells 5 Minute Briefing that "the key elements of building a data-driven enterprise are the ability to provide relevant, trustworthy, and timely data to the organization. Informatica 9 enables all three of these through the feature areas of business-IT collaboration, data quality, and multi-modal data provisioning services. Features in version 9 that enable business-IT collaboration include the ability to provide users with views into their data based on the way they are used to seeing it. This allows business users to participate in the entire process of getting the data they want. Data quality is provided via capabilities such as highly accurate global matching and address cleansing with domain-aware pre-built rules and reference data."
Regarding multi-modal data provisioning services, Parthasarathi continued by saying "we provide the right data to users at the right time based on their business needs. The right data means data in its most relevant form for its intended use including various data types and data technologies. The right time can vary according to the amount of data latency any given business process can tolerate and Informatica offers numerous data delivery styles and technologies in a sliding-scale fashion to meet all various needs. This includes batch ETL, real-time change data capture, and federated query capabilities."
According to Informatica, version 9 is the single most important release in the company's history, and is a comprehensive solution for solving the challenges of managing data across the enterprise. With Informatica 9, companies can lower the costs and time to discover data and deliver it the way it is needed. They can also identify the bad data that is impacting business decisions and fix it faster. This is facilitated by Informatica's support for all data domains and all applications across all geographies. Lastly, version 9 enables the control and management of data wherever it is located, whether that be on-premise, in the cloud, with partner networks or any combination of these.

Deliver Complex Hierarchical Data Natively with Advanced XML Data Integration

The Informatica PowerCenter Advanced XML Data Integration Option enables real-time access to hierarchical data otherwise locked in XML files and messages. This option handles all XML data integration challenges, including integration complexity, visibility into hierarchical data, high-performance requirements, and maintenance related to frequent changes. With this XML data integration option, your IT organization can natively access, parse, and create any XML file based on any XSD—including data locked in complex schemas, deep hierarchical structures, and large XML files.

  • Easily incorporate complex XML data into reports and analyses to deliver a more comprehensive view of data to the business
  • Improve business responsiveness and agility in competitive markets by simplifying the adoption of frequent changes to XML schemas
  • Increase IT productivity and reduce costs by natively supporting XML data integration, eliminating the need for hand coding and allowing collaboration between business and IT

Informatica PowerCenter Advanced XML Data Integration Option Key Features

  • Native support for all schema features, including complex XSD features such as xsi:type, to streamline XML data integration
  • Support for recursive structures and large schemas with deep hierarchies to streamline XML data integration
  • Integrated, codeless, visual environment for performing XML data integration with virtually unlimited access to any XML/XSD file
  • Simple drag-and-drop Excel interface for business users to easily transform complex hierarchical XML data into database and data warehouse structures

Monday, March 5, 2012

Structure, Semantics and Master Data Models

Looking back at some of my Informatica Perspectives posts over the past year or so, I reflected on some common themes about data management and data governance, especially in the context of master data management and particularly, master data models. As both the tools and the practices around MDM mature, we have seen some disillusionment in attempts to deploy an MDM solution, with our customers noting that they continue to hit bumps in the road in the technical implementation associated with both master data consolidation and then with publication of shared master data.

Almost every issue we see can be characterized into one of three buckets:

1)     Reference data issues, associated with misalignment of commonly used master data sets, conceptual data sets, value domains, and the various mappings across those ideas. For example, one business application defined US States to include only the 50 states and the District of Columbia, while other applications include Puerto Rico, the Virgin Islands, Guam, and Samoa as values within the US States data domain. (A small sketch of a reference-data crosswalk follows this list.)

2)     Structure issues, which often relate to differences in the source data models for similar data entity concepts (at the data element, table, and entity relationship levels) as well as differences between the source models, the master data models, and the downstream applications.

3)     Semantic issues, in which isolated “meaning” differences that are specific to an application cause problems in aligning data entity relationships into a common model. As an example, one of our clients is trying to create a master customer database, except that one source is the marketing application, and while the items in that data set are called “customers,” many are actually just “prospects” and have not yet committed to purchasing a product.
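
To make the first bucket concrete, a reference-data crosswalk is often the first artifact such an alignment effort produces. The sketch below is purely illustrative; the table name and the specific code values are hypothetical.

--Hypothetical crosswalk aligning two applications' "US States" value domains.
--App A's domain holds the 50 states plus DC; App B's also holds territories.
create table state_code_xref (
    app_a_code  char(2),           --null where App A has no equivalent value
    app_b_code  char(2) not null,
    description varchar(64)
);

insert into state_code_xref values ('CO', 'CO', 'Colorado');
insert into state_code_xref values (null, 'PR', 'Puerto Rico');  --App B only
insert into state_code_xref values (null, 'GU', 'Guam');         --App B only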

These are actually three sides of the same coin (funny looking coin, though ;-) ) that reflect program implementation stalls due to master data modeling issues. It turns out there are two “philosophical” approaches to master data modeling. On the one hand, we have the “pre-packaged” approach, in which the MDM vendor provides a set of core data models for each common data entity such as customer, vendor, contract, and product. These models are often derived from universal-style, hierarchical/object-oriented models, and are tweaked to fit into the architecture of the deployment site. This approach basically comes bundled with a set of defined services for accessing and updating master data attributes and objects. In the other approach, the characteristics of (and relationships among) master data models are driven by the specific needs of the business (and it is consequently referred to as the “business model-driven” approach). In my next post I will provide some additional details about the differences between the approaches. I will also be talking about this on a March 20 TDWI Webinar, “Is Your Approach to Modeling MDM Fixed or Flexible?”

New Features of Informatica 9

1. Informatica 9 supports data integration for the cloud as well as on premise. You can integrate the data in cloud applications, as well as run Informatica 9 on cloud infrastructure.

2. Informatica Analyst is a new tool available in Informatica 9.

3. There are architectural differences in Informatica 9 compared to previous versions.

4. A browser-based tool for business analysts is a new feature.

5. Data steward support is a new feature.

6. Allows unified administration with a new admin console that enables you to manage PowerCenter and PowerExchange from the same console.

7. Powerful new capabilities for data quality.

8. A single admin console for data quality, PowerCenter, PowerExchange, and data services.

9. In Informatica 9, Informatica Data Quality (IDQ) has been further integrated with the Informatica platform, and performance, manageability, and reusability have all been significantly enhanced.

10. Mapping rules are shared between the browser-based tool for analysts and the Eclipse-based developer tool, leveraging unified metadata underneath.

11. The data services capabilities in Informatica 9, both over SQL and web services, can be used for real-time dashboarding.

12. Informatica Data Quality provides worldwide address validation support with integrated geocoding.

13. The ability to define rules and to view and run profiles is available in both Informatica Developer (a thick client) and Informatica Analyst (a browser-based thin client). These tools sit on a unified metadata infrastructure, and both incorporate security features like authentication and authorization.

14. The developer tool is now Eclipse-based and supports both data integration and data quality for enhanced productivity. A browser-based tool supports analysts in the types of tasks they engage in, such as profiling data, specifying and validating rules, and monitoring data quality.

15. There will be a Velocity methodology; it is soon going to be introduced for Informatica 9.

16. Informatica has the capability to pull data from IMS, DB2 on zSeries and iSeries, and from several other legacy (mainframe) environments like VSAM, Datacom, and IDMS.

17. There are separate tools available for different roles. The Mapping Architect for Visio tool is designed for architects and developers to create templates for common data integration patterns, saving developers a tremendous amount of time.

18. Informatica 9 does not include ESB infrastructure.

19. Informatica supports open interfaces such as web services and can integrate with other tools that support these as well, including BPM tools.

20. Informatica 9 complements existing BI architectures by providing immediate access to data through data virtualization, which can supplement the data in existing data warehouses and operational data stores.

21. Informatica 9 supports profiling of mainframe data, leveraging the Informatica platform’s connectivity to mainframe sources.

22. Informatica 9 continues to support running the same workflow simultaneously.

23. An Eclipse-based environment is built for developers.

24. The browser-based tool is a fully functional interface for business analysts.

25. Dashboards are designed for business executives.

26. There are three interfaces through which these capabilities can be accessed: the Analyst tool is a browser-based tool for analysts and stewards; developers can use the Eclipse-based developer tool; and line-of-business managers can view data quality scorecards.

Informatica 9 New Features

PowerCenter 9 new features.

Effective in version 9.0, PowerCenter contains new features and enhancements.

Integration Service

Session log file rollover. You can limit the size of session logs for real-time sessions. You can limit the size by time or by file size. You can also limit the number of log files for a session. 

Licensing

Enforcement of licensing restrictions. PowerCenter will enforce the licensing restrictions on the number of CPUs and repositories. 

Lookup Transformation

Cache updates. You can update the lookup cache based on the results of an expression. When an expression is true, you can add to or update the lookup cache. You can update the dynamic lookup cache with the results of an expression.

Database deadlock resilience. In previous releases, when the Integration Service encountered a database deadlock during a lookup, the session failed. Effective in 9.0, the session will not fail. When a deadlock occurs, the Integration Service attempts to run the last statement in a lookup. You can configure the number of retry attempts and time period between attempts.

Multiple rows return. You can configure the Lookup transformation to return all rows that match a lookup condition. A Lookup transformation is an active transformation when it can return more than one row for any given input row.

SQL overrides for uncached lookups. In previous versions you could create a SQL override for cached lookups only. You can now create an SQL override for uncached lookups. You can include lookup ports in the SQL query.


Mapping Architect for Visio

New mapping objects. You can include the following objects in a mapping template:
  • Pipeline Normalizer transformation
  • Custom transformation
  • PowerExchange source definition
  • PowerExchange target definition

You can also create a mapping template from a mapping that contains these objects.

Shortcuts. You can configure a transformation to use a shortcut. You can create a mapping template from a mapping that contains shortcuts to reusable transformations.


SQL Transformation

Auto-commit for connections. You can enable auto-commit for each database connection. Each SQL statement in a query defines a transaction. A commit occurs when the SQL statement completes or the next statement is executed, whichever comes first.
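
As an illustration, consider a query-mode SQL transformation issuing the two statements below against a hypothetical orders schema. The table and port names are assumptions, not part of the release notes, and the ?PORT? notation for binding an input port should be verified against your version's documentation. With auto-commit enabled, each statement commits independently rather than as one unit.

--Hypothetical query-mode statements; ?ORDER_ID? binds an input port.
--With auto-commit enabled on the connection, each statement is its own
--transaction: the UPDATE commits even if the INSERT fails afterwards.
UPDATE orders
SET    status = 'SHIPPED'
WHERE  order_id = ?ORDER_ID?;

INSERT INTO order_audit (order_id, action, changed_on)
VALUES (?ORDER_ID?, 'SHIPPED', CURRENT_TIMESTAMP);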

Exactly-once processing. The Integration Service provides exactly-once delivery of real-time source messages to the SQL transformation. If there is an interruption in processing, the Integration Service can recover without requiring the message to be sent again. To perform exactly-once processing, the Integration Service stores a set of operations for a checkpoint in the PM_REC_STATE table.

Passive transformation. You can configure the SQL transformation to run in passive mode instead of active mode. When the SQL transformation runs in passive mode, the SQL transformation returns one output row for each input row.


XML Transformation

XML Parser buffer validation. The XML Parser transformation can validate an XML document against a schema. The XML Parser transformation routes invalid XML to an error port. When the XML is not valid, the XML Parser transformation routes the XML and the error messages to a separate output group that you can connect to a target.


Starting from version 9, the PowerCenter Administration Console is called Informatica Administrator.

Informatica Administrator (PowerCenter Administration Console) Effective in version 9.0, the PowerCenter Administration Console is renamed to Informatica Administrator. The Informatica Administrator has a new interface. Some of the properties and configuration tasks from the PowerCenter Administration Console have been moved to different locations in Informatica Administrator. The Informatica Administrator is expanded to include new services and objects.

Analyst Service. Application service that runs Informatica Analyst in the Informatica domain. Create and enable an Analyst Service on the Domain tab of Informatica Administrator. When you enable the Analyst Service, the Service Manager starts Informatica Analyst. You can open Informatica Analyst from Informatica Administrator.

Data Integration Service. Application service that processes requests from Informatica Analyst and Informatica Developer to preview or run data profiles and mappings. It also generates data previews for SQL data services and runs SQL queries against the virtual views in an SQL data service. Create and enable a Data Integration Service on the Domain tab of Informatica Administrator.
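
For context, an SQL data service exposes virtual views that clients can query with ordinary SQL. The view and column names below are hypothetical; a request of this shape is what the Data Integration Service would process and federate against the underlying sources.

--Hypothetical query against a virtual view in an SQL data service.
--A BI tool or JDBC/ODBC client could issue this for a dashboard.
SELECT region, SUM(order_amount) AS total_sales
FROM customer_orders_view            --virtual view inside the data service
WHERE order_date >= DATE '2012-01-01'
GROUP BY region
ORDER BY total_sales DESC;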

Model Repository Service. Application service that manages the Model repository. The Model repository is a relational database that stores the metadata for projects created in Informatica Analyst and Informatica Designer. The Model repository also stores run-time and configuration information for applications deployed to a Data Integration Service. Create and enable a Model Repository Service on the Domain tab of Informatica Administrator.

PowerExchange Listener Service. Manages the PowerExchange Listener for bulk data movement and change data capture. The PowerCenter Integration Service connects to the PowerExchange Listener through the Listener Service.

PowerExchange Logger Service. Manages the PowerExchange Logger for Linux, UNIX, and Windows to capture change data and write it to the PowerExchange Logger Log files. Change data can originate from DB2 recovery logs, Oracle redo logs, a Microsoft SQL Server distribution database, or data sources on an i5/OS or z/OS system.

Connection management. Database connections are centralized in the domain. You can create and view database connections in Informatica Administrator, Informatica Developer, or Informatica Analyst. Create, view, edit, and grant permissions on database connections in Informatica Administrator.

Deployment. You can deploy, enable, and configure deployment units in the Informatica Administrator. Deploy deployment units to one or more Data Integration Services. Create deployment units in Informatica Developer.

Monitoring. You can monitor profile jobs, scorecard jobs, preview jobs, mapping jobs, and SQL Data Services for each Data Integration Service. View the status of each monitored object on the Monitoring tab of Informatica Administrator.

Architectural difference between Informatica 9 and Informatica 8.x

Lookup Transformation: Cache updates. We can update the lookup cache based on the results of an expression. When an expression is true, we can add to or update the lookup cache. We can update the dynamic lookup cache with the results of an expression.
Multiple rows return: We can configure the Lookup transformation to return all rows that match a lookup condition. A Lookup transformation is an active transformation when it can return more than one row for any given input row.
SQL overrides for uncached lookups: In previous versions we could create a SQL override for cached lookups only. We can now create an SQL override for uncached lookups. We can include lookup ports in the SQL query.
Database deadlock resilience: In previous releases, when the Integration Service encountered a database deadlock during a lookup, the session failed. Effective in 9.0, the session will not fail. When a deadlock occurs, the Integration Service attempts to run the last statement in a lookup. We can configure the number of retry attempts and time period between attempts.
SQL transformation: Auto-commit for connections. We can enable auto-commit for each database connection. Each SQL statement in a query defines a transaction. A commit occurs when the SQL statement completes or the next statement is executed, whichever comes first.
Session Log files rollover: We can limit the size of session logs for real-time sessions. We can limit the size by time or by file size. We can also limit the number of log files for a session.
Passive transformation: We can configure the SQL transformation to run in passive mode instead of active mode. When the SQL transformation runs in passive mode, the SQL transformation returns one output row for each input row.
XML transformation: XML Parser buffer validation. The XML Parser transformation can validate an XML document against a schema. The XML Parser transformation routes invalid XML to an error port. When the XML is not valid, the XML Parser transformation routes the XML and the error messages to a separate output group that we can connect to a target.
Model Repository Service: Application service that manages the Model repository. The Model repository is a relational database that stores the metadata for projects created in Informatica Analyst and Informatica Designer. The Model repository also stores run-time and configuration information for applications deployed to a Data Integration Service. Create and enable a Model Repository Service on the Domain tab of Informatica Administrator.
Connection management: Database connections are centralized in the domain. We can create and view database connections in Informatica Administrator, Informatica Developer, or Informatica Analyst. Create, view, edit, and grant permissions on database connections in Informatica Administrator.
Deployment: We can deploy, enable, and configure deployment units in the Informatica Administrator. Deploy deployment units to one or more Data Integration Services. Create deployment units in Informatica Developer.
Monitoring: We can monitor profile jobs, scorecard jobs, preview jobs, mapping jobs, and SQL Data Services for each Data Integration Service. View the status of each monitored object on the Monitoring tab of Informatica Administrator.