Red Hat unveils big data, open hybrid cloud direction
Red Hat Inc on Monday announced its big data direction and solutions to meet the demands of enterprises that require highly reliable, scalable, and manageable solutions to run their big data analytics workloads effectively.
Red Hat announced that it will contribute its Red Hat Storage Hadoop plug-in to the Apache Hadoop open community, transforming Red Hat Storage into a fully supported, Hadoop-compatible file system for big data environments. The company is also building a robust network of ecosystem and enterprise integration partners to deliver comprehensive big data solutions to enterprise customers.
Red Hat's big data infrastructure and application platforms are ideally suited for enterprises leveraging the open hybrid cloud environment, the company said.
Red Hat is working with the open cloud community to support big data customers. Many enterprises worldwide use public cloud infrastructure, such as Amazon Web Services (AWS), for the development, proof-of-concept, and pre-production phases of their big data projects, it said.
Red Hat is actively engaged in the open cloud community through projects like OpenStack and OpenShift Origin to help meet these enterprise big data expectations both today and in the future.
The company said several Red Hat solutions are available to manage enterprise big data workloads effectively. Red Hat's big data direction focuses on three primary areas: delivering enhanced enterprise-class infrastructure solutions, extending its application platforms, and partnering with leading big data analytics vendors and integrators.
Ranga Rangachari, vice president and general manager, Storage, Red Hat, said:
"With today's announcement, Red Hat demonstrates its strong commitment to continue to provide enterprise infrastructure and platforms to effectively run big data applications today and in the growing open hybrid cloud environment. With true enterprise-class offerings, Red Hat leverages the power of the open source community to give our big data customers a choice in technology, deployment environments, and partners."
Red Hat said its Big Data Infrastructure Solutions are:
- Red Hat Enterprise Linux - According to The Linux Foundation's January 2012 Enterprise Linux User Report, the majority of big data implementations run on Linux, and Red Hat Enterprise Linux, from the leading provider of commercial Linux, is a leading platform for big data deployments. Red Hat Enterprise Linux excels in distributed architectures and includes features that address critical big data needs. Managing tremendous data volumes and intensive analytic processing requires an infrastructure designed for high performance, reliability, fine-grained resource management, and scale-out storage. Red Hat Enterprise Linux addresses these challenges while adding the ability to develop, integrate, and secure big data applications reliably, and to scale easily to keep up with the pace at which data is generated, analyzed, or transferred. This can be accomplished in the cloud, making it easier to store, aggregate, normalize, and integrate data from sources across multiple platforms, whether they are deployed as physical, virtual, or cloud-based resources.
- Red Hat Storage - Built on the trusted Red Hat Enterprise Linux operating system and the proven GlusterFS distributed file system, Red Hat Storage Servers can be used to pool inexpensive commodity servers to provide a cost-effective, scalable, and reliable storage solution for big data.
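The server-pooling workflow described above can be sketched with the standard Gluster command-line interface; the hostnames, brick paths, and volume name below are hypothetical placeholders, not part of any Red Hat announcement:

```shell
# Join three commodity servers into one trusted storage pool
# (run on server1; all names and paths are illustrative)
gluster peer probe server2
gluster peer probe server3

# Aggregate one brick (a local directory) from each server into a
# 3-way replicated volume, trading raw capacity for reliability
gluster volume create bigdata-vol replica 3 \
  server1:/export/brick1 server2:/export/brick1 server3:/export/brick1
gluster volume start bigdata-vol

# Clients then mount the pooled volume like an ordinary file system
mount -t glusterfs server1:/bigdata-vol /mnt/bigdata
```

Because replication is handled across inexpensive servers rather than within specialized hardware, capacity and throughput grow by adding more bricks to the pool.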
Red Hat said it intended to make its Hadoop plug-in for Red Hat Storage available to the Hadoop community later this year.
Currently in technology preview, the Red Hat Storage Apache Hadoop plug-in provides a new storage option for enterprise Hadoop deployments, delivering enterprise storage features while maintaining the API compatibility and local data access the Hadoop community expects. Red Hat Storage brings enterprise-class features such as geo-replication, high availability, POSIX compliance, disaster recovery, and management to big data environments without compromising API compatibility or data locality. Customers now have a unified data and scale-out storage software platform that accommodates files and objects deployed across physical, virtual, public, and hybrid cloud resources.
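As a sketch of how such a plug-in is wired into Hadoop, a deployment would typically point Hadoop's file-system configuration at the Gluster implementation class in core-site.xml. The property names and values below are assumptions based on the general pattern for Hadoop file-system plug-ins and may differ from what the technology-preview release actually ships:

```xml
<!-- core-site.xml fragment (illustrative; property names are assumptions) -->
<configuration>
  <property>
    <!-- register the Gluster-backed Hadoop FileSystem implementation -->
    <name>fs.glusterfs.impl</name>
    <value>org.apache.hadoop.fs.glusterfs.GlusterFileSystem</value>
  </property>
  <property>
    <!-- make the Gluster volume the default file system for jobs -->
    <name>fs.default.name</name>
    <value>glusterfs://server1:9000</value>
  </property>
  <property>
    <!-- hypothetical: local FUSE mount of the volume, giving MapReduce
         tasks the local data access the Hadoop community expects -->
    <name>fs.glusterfs.mount</name>
    <value>/mnt/bigdata</value>
  </property>
</configuration>
```

With configuration of this shape, existing MapReduce jobs read and write through the standard Hadoop FileSystem API while the data physically lives on the replicated Gluster volume.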
- Red Hat Enterprise Virtualization - Announced in December 2012, Red Hat Enterprise Virtualization 3.1 is integrated with Red Hat Storage, enabling it to access the secure, shared storage pool managed by Red Hat Storage. This integration also offers enterprises reduced operational costs, expanded portability, choice of infrastructure, scalability, availability, and the power of community-driven innovation through the contributions of the open source oVirt and Gluster projects. The combination of these platforms furthers Red Hat's open hybrid cloud vision of an integrated and converged Red Hat Storage and Red Hat Enterprise Virtualization node that serves both compute and storage resources.
Red Hat's Big Data Application and Integration Platforms are:
- Red Hat JBoss Middleware - Red Hat JBoss Middleware provides enterprises with powerful technologies for creating and integrating big data-driven applications that can interact with new and emerging technologies such as Hadoop and MongoDB. Big data is only valuable when businesses can extract information from it and respond intelligently. Red Hat JBoss Middleware solutions can load large volumes and varieties of data into Hadoop quickly and reliably with high-speed messaging technologies; simplify working with MongoDB through Hibernate OGM; process large volumes of data quickly and easily with Red Hat JBoss Data Grid; access Hadoop alongside traditional data sources with JBoss Enterprise Data Services Platform; and identify opportunities and threats through pattern recognition with JBoss Enterprise BRMS. Red Hat's middleware portfolio is well suited to help enterprises seize the opportunities of big data.