Informatica, the world’s number one independent software provider focused on delivering transformative innovation for the future of all things data, today announced its $1 Million Software and Services Big Data Ready Challenge to help organizations move their big data projects from pilot to production. Through the contest, Informatica will award a total of at least US$1 million in software and services to help qualifying organizations in North America deliver clear business value from their big data initiatives. Each quarter over the next 12 months, six finalists and two winners will be selected, and a grand prize winner will be chosen at the Big Data Ready Fall Summit in December 2016.
The judges for the contest include:
— Professor Tom Davenport, President’s Distinguished Professor of Information Technology and Management at Babson College.
— Dr. Claudia Imhoff, President and Founder of the Boulder BI Brain Trust and Intelligent Solutions, Inc., and a thought leader, visionary, and practitioner in the rapidly growing fields of business intelligence and advanced analytics.
— David S. Linthicum, consultant at Cloud Technology Partners and internationally recognized industry expert and thought leader.
— Mark Smith, CEO and Executive Chief Research Officer, Ventana Research.
— Amit Walia, EVP and Chief Product Officer, Informatica.
— Ray Wang, Principal Analyst, Founder and Chairman, Constellation Research.
Additionally, Informatica introduced the Informatica Big Data Management framework, the most comprehensive hybrid data management approach to manage all things big data. The framework takes a holistic view of big data management, built on three pillars required of big data management solutions: big data integration, data governance and data quality, and data security.
Big Data Integration
Big data integration must deliver high-throughput data ingestion and at-scale processing so business analysts can make better decisions using next-generation analytics tools. It helps businesses gain better insights from big data because it:
- Speeds up development, leverages existing IT skills, and simplifies maintenance through the use of a simple visual interface supported by easy-to-use templates.
- Increases performance and resource utilization by optimizing data processing execution and providing flexible, hybrid deployment across a variety of platforms.
- Handles a wide variety of data sources through hundreds of pre-built transformations and connectors, and orchestrates data flows by using broker-based data ingestion.
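To illustrate the idea of chaining pre-built transformations over ingested records, here is a minimal generic sketch. This is not Informatica's API; the function names, field names, and pipeline shape are hypothetical, chosen only to show the pattern.

```python
# Generic sketch of a transformation pipeline; names are hypothetical,
# not part of any Informatica product.

def trim_fields(record):
    """Pre-built transformation: strip whitespace from all string values."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def standardize_country(record):
    """Pre-built transformation: normalize a country name to a code."""
    codes = {"usa": "US", "united states": "US", "canada": "CA"}
    value = str(record.get("country", "")).lower()
    record["country"] = codes.get(value, record.get("country"))
    return record

def run_pipeline(records, transformations):
    """Apply each transformation to every ingested record, in order."""
    for transform in transformations:
        records = [transform(r) for r in records]
    return records

ingested = [{"name": " Ada Lovelace ", "country": "usa"}]
cleaned = run_pipeline(ingested, [trim_fields, standardize_country])
print(cleaned)  # [{'name': 'Ada Lovelace', 'country': 'US'}]
```

In a visual tool, each box on the canvas plays the role of one of these reusable transformation functions, which is what lets existing IT skills carry over and keeps maintenance simple.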
Data Governance and Data Quality
End-to-end big data governance and quality means business and IT users can be confident with the data they are using. Comprehensive data governance includes:
- Formal data quality assessments to detect data anomalies sooner.
- Pre-built data quality rules to ensure data is “fit-for-purpose.”
- Universal metadata catalog to facilitate search and automate data processing.
- Entity matching and linking to enrich master data, such as for customers.
- End-to-end data lineage for data provenance, traceability and compliance audits.
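The first two governance items above, formal quality assessments driven by pre-built rules, can be sketched generically as follows. The rule names, fields, and report format are hypothetical illustrations of the technique, not any specific product's rule library.

```python
# Generic sketch of a rule-based data quality assessment; rule and field
# names are hypothetical, not part of any Informatica product.
import re

RULES = {
    "email_is_valid": lambda r: bool(
        re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", r.get("email", ""))
    ),
    "age_in_range": lambda r: isinstance(r.get("age"), int) and 0 <= r["age"] <= 130,
}

def assess(records):
    """Return, per rule, the indices of records that violate it."""
    anomalies = {name: [] for name in RULES}
    for i, record in enumerate(records):
        for name, rule in RULES.items():
            if not rule(record):
                anomalies[name].append(i)
    return anomalies

records = [
    {"email": "ada@example.com", "age": 36},
    {"email": "not-an-email", "age": 200},
]
report = assess(records)
print(report)  # {'email_is_valid': [1], 'age_in_range': [1]}
```

Running such checks at ingestion time is what lets anomalies surface sooner, before downstream analytics consume data that is not fit for purpose.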
Data Security
Risk-centric big data security analyzes all data to quickly detect and act upon risks and vulnerabilities. This requires a 360-degree view of sensitive data, supported by risk analytics and policy-based protection of at-risk data. Data preparation is also required to ensure that big data is consistent and of high quality, and big data security should de-identify information governed by corporate policies and industry regulations. Together, risk-centric big data security and data preparation must enable:
- “Single pane of glass” monitoring of sensitive data stores to provide visibility into the locations of sensitive data.
- Sensitive data discovery and classification for a comprehensive 360-degree view of sensitive data.
- Usage and proliferation analysis for a precise understanding of data risk.
- The integration of disparate applications to easily process and analyze data, leading to insights that help organizations make better decisions.
- Risk assessment to help prioritize investments in security programs.
- Non-intrusive persistent and dynamic data masking to protect sensitive data in development and production environments, minimizing the risk of a security breach.
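As a rough illustration of the last point, persistent masking can be sketched as a deterministic, irreversible substitution: the same sensitive input always maps to the same token, so de-identified test data stays internally consistent. This is a generic sketch of the technique with hypothetical field names and salt, not Informatica's masking implementation.

```python
# Generic sketch of persistent (deterministic) data masking; field names
# and salt are hypothetical, not part of any Informatica product.
import hashlib

SENSITIVE_FIELDS = {"ssn", "email"}

def mask_value(value, salt="demo-salt"):
    """Replace a sensitive value with a stable, irreversible token."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "MASKED-" + digest[:8]

def mask_record(record):
    """Mask sensitive fields; leave non-sensitive fields usable for testing."""
    return {k: mask_value(v) if k in SENSITIVE_FIELDS else v
            for k, v in record.items()}

prod = {"name": "Ada", "ssn": "123-45-6789", "email": "ada@example.com"}
dev = mask_record(prod)
# Non-sensitive data survives; sensitive values are de-identified.
assert dev["name"] == "Ada" and dev["ssn"].startswith("MASKED-")
# "Persistent": remasking the same input yields the same tokens.
assert mask_record(prod) == dev
```

Dynamic masking applies the same kind of substitution on the fly at query time, so production data is never exposed to unauthorized users in the first place.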