Hadoop Happenings: Apache Falcon Graduates

January 27, 2015

Grab the latest news and commentary about Hadoop in this week’s Hadoop Happenings. This week the Apache Software Foundation announced that Apache Falcon has graduated to a Top-Level Project, Hortonworks’ distribution is now available on the Google Cloud Platform, and Netflix is open sourcing some of its analytics tools.

1. Netflix is open sourcing tools for analyzing data in Hadoop. Netflix is open sourcing some of its analytics tools in a project called Surus, including a Pig function called ScorePMML. Read More

2. Machine Learning + Big Data. This post breaks down big data and machine learning into three phases: collect, analyze, and predict. It also discusses the role of Hadoop and Spark in the big data analytics picture. Read More

3. Big Data Digest: How Many Hadoops do we need? The Apache Software Foundation announced Apache Flink, an engine that can ingest both batch and streaming data. However, analysts question whether the project can compete with Apache Spark and Hadoop. Read More

4. The Apache Software Foundation Announces Apache Falcon as a Top-Level Project. Apache Falcon, a data processing and management solution for Hadoop, has graduated to a Top-Level Project. Read More

5. The Google cloud makes room for Hortonworks’ Hadoop distribution. Hortonworks announced that its distribution is now available on the Google Cloud Platform. Read More

6. Apache Flink: Possible replacement for Hadoop? Apache Flink is similar to Hadoop, except that it can also analyze streaming data and leverage in-memory processing. Read More

7. Apache Falcon gets top-level status, filling a gap in the Hadoop ecosystem. Apache Falcon is a high-level framework that implements automated controls to manage the flow of data. Read More

8. 6 Companies Using Big Data to Change Business. Startups specializing in business intelligence continue to spring up as enterprises focus on becoming more data-driven. Read More

9. Lessons from the First Wave of Hadoop Adoption. Dan Woods comments on four lessons he has learned during the early adoption of Hadoop, including the complexity of the big data system. Read More

10. It’s not just about Hadoop core anymore. Organizations are moving beyond using solely MapReduce for their big data projects and are adopting a broader set of tools within the Hadoop ecosystem. Read More
