Cosmos - Big Data Analysis
The Cosmos Big Data Analysis GE is a set of tools that enable streaming and batch processing of context data. These tools are:
- Orion-Flink Connector (Source and Sink)
- Apache Flink Processing Engine
- Apache Spark Processing Engine (work in progress)
- Streaming processing examples using Orion Context Broker
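In a typical streaming setup, Orion Context Broker is subscribed so that context changes are pushed to the processing job. A minimal sketch of such an NGSI v2 subscription payload follows; the entity type, attribute name, and notification URL are illustrative assumptions, not values from this document:

```python
import json

# Hypothetical subscription: Orion notifies a streaming job (assumed to listen
# at http://flink-job:9001/notify) whenever the "temperature" attribute of any
# "Room" entity changes.
subscription = {
    "description": "Notify the streaming job about temperature changes",
    "subject": {
        "entities": [{"idPattern": ".*", "type": "Room"}],
        "condition": {"attrs": ["temperature"]},
    },
    "notification": {
        "http": {"url": "http://flink-job:9001/notify"},  # assumed endpoint
        "attrs": ["temperature"],
    },
}

# This JSON body would be POSTed to Orion at /v2/subscriptions.
payload = json.dumps(subscription, indent=2)
print(payload)
```

From then on, every matching context update is delivered to the job as an HTTP notification, which the source connector turns into a stream of events.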
Lesson 1. Big Data Fundamentals
By following this lesson you will learn the fundamentals of Big Data (distributed storage and computing), the basics of Map and Reduce, and Hadoop and its ecosystem.
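The Map and Reduce model covered in this lesson can be illustrated with a small, self-contained sketch: a pure-Python word count, not tied to Hadoop. The map phase emits `(word, 1)` pairs per input split; the reduce phase groups pairs by key and sums their counts:

```python
from collections import defaultdict

def map_phase(document):
    # Emit a (word, 1) pair for every word in the input split.
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    # Group the intermediate pairs by key and sum the counts per word,
    # mirroring the shuffle/sort step between map and reduce.
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

documents = ["big data big ideas", "data pipelines move big data"]
intermediate = [pair for doc in documents for pair in map_phase(doc)]
result = reduce_phase(intermediate)
print(result)  # {'big': 3, 'data': 3, 'ideas': 1, 'pipelines': 1, 'move': 1}
```

In Hadoop the same two functions run distributed: each mapper processes one block of the input, and the framework performs the grouping before the reducers run.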
Lesson 2. HDFS and Map Reduce
By following this lesson you will be able to use the global instance of Cosmos in FIWARE Lab: performing I/O with the WebHDFS API and submitting and managing MapReduce jobs with the Tidoop API.
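As a sketch of what that I/O looks like, WebHDFS operations are plain HTTP requests whose `op` query parameter names the operation (`LISTSTATUS`, `OPEN`, `CREATE`, ...). The helper below only builds the request URLs; the host, port, and user shown are assumptions for illustration:

```python
def webhdfs_url(host, user, path, op, port=14000):
    # WebHDFS exposes each file operation as the "op" query parameter
    # on the file's path under /webhdfs/v1.
    return (f"http://{host}:{port}/webhdfs/v1/user/{user}/{path}"
            f"?op={op}&user.name={user}")

# Example: list a directory and read a file on an (assumed) Cosmos instance.
ls_url = webhdfs_url("cosmos.lab.fiware.org", "frb", "data", "LISTSTATUS")
get_url = webhdfs_url("cosmos.lab.fiware.org", "frb", "data/input.txt", "OPEN")
print(ls_url)
print(get_url)
```

With `curl` or any HTTP client, issuing the request against such a URL performs the corresponding HDFS operation on the remote cluster.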
Lesson 3. Big Data Analysis
This video presentation explains context information processing and advanced Big Data Analysis.
Use of the Cosmos Flink Connector is described in the following step-by-step tutorial:
A GitHub repository containing a few examples for getting started with the Cosmos Flink connector is also available: