Informatica, the world’s number one independent software provider focused on delivering transformative innovation for the future of all things data, today announced Informatica Big Data Management, the industry’s first big data management solution that brings together Big Data Integration, Big Data Quality and Governance, and Big Data Security in a single integrated solution. While more than 67 percent of enterprises see big opportunity in big data, the majority of big data projects fail. The new Informatica solution enables organizations to effectively overcome the data management challenges that are frequently at the root of big data project failures. Informatica Big Data Management also dramatically reduces the need for hand-coding and big data skill sets that are expensive and hard to come by.
In September, Informatica introduced the Informatica Big Data Management framework, a comprehensive approach to managing all things big data, encompassing the three data management pillars: data integration, data quality and governance, and data security. Informatica Big Data Management delivers the technologies and capabilities to execute on this comprehensive framework and drive big data success.
“Data is the lifeblood of business, and only Informatica does end-to-end data management for big data,” said Anil Chakravarthy, acting chief executive officer, Informatica. “Big data represents the next frontier of competitive differentiation, superior customer experiences and business innovation. From driving rapid project implementations to ensuring confidence in the data and the safety of sensitive information, Informatica Big Data Management empowers business and IT leadership with unparalleled automation, pre-built tools and optimized capabilities. This allows for quick experimentation and seamless, mission-critical production deployments that deliver maximum business value from big data.”
Big Data Management – Three Integrated Components
Dynamic, At-Scale Big Data Integration
Too much valuable time is spent tapping a wide array of data sources, from traditional databases to streaming feeds, and bringing that data into Hadoop. An effective data pipeline, from ingestion to publishing, is a cornerstone of every big data implementation. Informatica Big Data Integration enables organizations to:
• Ingest nearly instantly
o Universal connectivity – More than 200 pre-built, high performance Informatica Connectors enable any type of data to be quickly ingested to big data platforms, such as Hadoop, NoSQL and MPP appliances.
o High throughput, low latency data integration – Mass ingestion and real-time streaming enable high-throughput, low-latency data integration.
• Process everything
o Out of the box scalable processing – More than 100 pre-built data integration and data quality transformations and parsers run natively on Hadoop to enable scalable processing of large data sets.
o Automated data integration processes – Dynamic mappings and parameterization enable programmatic automation of data integration processes.
o Visual development – Data pipelines can be developed up to 5x faster than hand-coding using a visual development environment.
• Deploy optimally
o Easy provisioning – Wizards and mapping templates enable easy provisioning of data from thousands of sources into a data lake or operational data store. Productivity and ease of maintenance are dramatically improved by automatically generating whole classes of data flows at runtime based on design patterns using just a handful of templates.
o Adaptability to changing environments – Dynamic schema support enables connectivity to flexible data formats.
o Optimized engines – Delivers maximum performance and resource utilization for at-scale data integration. Informatica optimizes big data workloads by using MapReduce and the new Informatica Blaze engine via YARN.
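The template-driven automation described above (dynamic mappings, parameterization, and runtime generation of data flows from a handful of templates) can be illustrated with a minimal, hypothetical Python sketch. The class and field names below are assumptions for illustration only, not Informatica's actual API:

```python
from dataclasses import dataclass

@dataclass
class MappingTemplate:
    """Hypothetical stand-in for a reusable design pattern: read -> transform -> write."""
    name: str

    def instantiate(self, source: str, target: str, schema: dict) -> dict:
        # At runtime, one template plus parameters yields a concrete data flow,
        # so N sources need N parameter sets rather than N hand-coded pipelines.
        return {
            "flow": f"{self.name}:{source}->{target}",
            "columns": list(schema),
            "steps": ["ingest", "standardize", "publish"],
        }

template = MappingTemplate("ingest_to_lake")
sources = {
    "crm_orders": {"id": "int", "amount": "float"},
    "web_clicks": {"ts": "str", "url": "str"},
}

# One template, many concrete flows, generated programmatically.
flows = [template.instantiate(src, "data_lake", schema)
         for src, schema in sources.items()]
```

The point of the pattern is that adding a new source means adding a parameter set, not writing new pipeline code.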
Holistic Big Data Quality and Governance
Big data presents sizable quality and governance challenges that make it difficult for organizations to trust their data. Data quality needs are changing because the same data is now used for multiple purposes. Additionally, because everything and everyone is interconnected, there are often hidden relationships that, once discovered, can yield tremendous insights. Trust issues are also magnified by new sources of external data. Informatica Big Data Management enables organizations to address these challenges and turn big data into an opportunity to drive business value by ensuring its transparency, auditability, agility and trustworthiness.
Specifically, Informatica Big Data Quality and Governance offers organizations:
• Collaborative stewardship
o IT/business collaboration – Intuitive non-technical user experience empowers analysts and data stewards to participate effectively in holistic data stewardship processes, while comprehensive business process management capabilities promote collaboration between business and IT stakeholders.
o Big data profiling, discovery and alerting – Data profiling and discovery, including business rule profiling, highlights data quality issues and anomalies, while monitoring rules and alerts can be easily created to track and flag quality issues.
• 360-degree insight
o 360-degree relationship discovery – Enables high performance and flexible holistic relationship discovery (parties, households, etc.) across big data environments.
o Live Data Map – A universal metadata catalog and knowledge graph for searching, discovering and understanding enterprise data; it leverages Spark for fast, at-scale knowledge graph creation.
• Complete confidence
o Highly scalable data quality processes – Data validation, enrichment and de-duplication can be deployed on Hadoop for scale.
o Comprehensive auditing and analysis – End-to-end visibility into data lineage beyond Hadoop supports compliance and enables effective data quality root-cause and impact analysis.
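The data quality capabilities above (business-rule profiling and de-duplication) can be sketched in a few lines of plain Python. This is a toy illustration of the concepts, not Informatica's implementation; the rule, records and normalization key are invented for the example:

```python
import re

records = [
    {"id": 1, "email": "ann@example.com", "name": "Ann Lee"},
    {"id": 2, "email": "ann@example.com", "name": "Ann  Lee"},  # near-duplicate
    {"id": 3, "email": "not-an-email",    "name": "Bob Wu"},
]

# A simple business rule: the email field must look like an address.
EMAIL_RULE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def profile(recs):
    """Profiling: return the ids of records that violate the business rule."""
    return [r["id"] for r in recs if not EMAIL_RULE.match(r["email"])]

def dedupe(recs):
    """De-duplication: keep the first record per normalized (email, name) key."""
    seen, kept = set(), []
    for r in recs:
        key = (r["email"].lower(), " ".join(r["name"].split()).lower())
        if key not in seen:
            seen.add(key)
            kept.append(r)
    return kept

violations = profile(records)  # ids flagged for stewards to review
clean = dedupe(records)        # records surviving de-duplication
```

At production scale these same validation and matching rules would run as distributed jobs on Hadoop rather than in-memory loops.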
Risk-Centric Big Data Security
With the growing dispersion of big data, organizations are increasingly challenged to understand where their sensitive data resides and what data assets can be trusted. Informatica Big Data Management discovers sensitive data, its proliferation, usage, provenance and protection status to analyze and visualize sensitive data risk and vulnerabilities. It protects sensitive data by de-identifying and de-sensitizing information governed by corporate policies and industry regulations. Informatica Big Data Security provides organizations with:
• Full spectrum visibility
o Discovery of sensitive data with context – Organizations can see who has access to sensitive data, who is actually accessing it, whether it is protected and where it is proliferating, including tracing data flows, lineage and history.
o Visualizations – Visualizations and reports identify sensitive data by geography, function and ranking attributes.
• Risk analytics
o Risk scoring – Determine sensitive data risk by analyzing location, proliferation, cost, protection status and use to highlight vulnerabilities for remediation.
o Sensitive data discovery – Sensitive data profiling, discovery and analysis enables organizations to clearly understand their big data security risks.
o Active alerting – Alerts inform administrators and security professionals of high risk conditions.
• Policy-based protection
o De-identification for applications, test environments, reporting and analytics – Leverage centralized policy management to protect and secure sensitive data. Dynamic data masking de-identifies sensitive data in production environments based on user roles and privileges. Persistent data masking protects live and archived sensitive data in non-production environments such as test, development and training.
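The role-based dynamic masking described above can be illustrated with a minimal Python sketch. The field names, roles and masking rule here are hypothetical, chosen only to show the idea of de-identifying sensitive values for non-privileged users while leaving the stored data intact:

```python
def mask_ssn(value: str) -> str:
    """Mask all but the last four digits of a (hypothetical) SSN field."""
    return "***-**-" + value[-4:]

# Which fields are sensitive, and how to de-identify each one.
SENSITIVE = {"ssn": mask_ssn}

def read_field(record: dict, field: str, role: str) -> str:
    """Dynamic masking: privileged roles see the real value, others a masked view.

    The underlying record is never modified; masking is applied at read time.
    """
    value = record[field]
    if field in SENSITIVE and role != "privileged":
        return SENSITIVE[field](value)
    return value

patient = {"name": "Ann Lee", "ssn": "123-45-6789"}
analyst_view = read_field(patient, "ssn", role="analyst")
admin_view = read_field(patient, "ssn", role="privileged")
```

Persistent masking, by contrast, would rewrite the stored value itself before the data reaches a non-production environment.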
Informatica Big Data Management is available today via subscription. For further information, visit https://informatica.com/products/big-data/big-data-edition.html.