Declarative workflows for building Spark Streaming applications
Spark Streaming
Spark Streaming is an extension of the core Spark API that enables stream processing from a variety of sources.
Spark is an extensible and programmable framework for massively distributed processing of datasets,
which are called Resilient Distributed Datasets (RDDs). Spark Streaming receives input data streams and divides the data into batches, which are then processed by the Spark engine to generate the results.
Spark Streaming data is organized into a sequence of DStreams,
each represented internally as a sequence of RDDs.
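The micro-batch idea above can be sketched in a few lines of plain Python (a toy illustration only, not Spark code): records from a stream are grouped into fixed-size batches, and each batch is processed as one unit, much as Spark Streaming turns a stream into a sequence of RDDs.

```python
# Toy sketch of micro-batching (plain Python, NOT Spark):
# a stream is cut into fixed-size batches, and each batch is
# processed as a unit, mirroring how a DStream maps to RDDs.

def micro_batches(stream, batch_size):
    """Yield successive fixed-size batches from an iterable stream."""
    batch = []
    for record in stream:
        batch.append(record)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

def process(batch):
    """Stand-in for the Spark engine: count words in one batch."""
    counts = {}
    for line in batch:
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
    return counts

stream = ["a b a", "b c", "a a", "c"]
results = [process(b) for b in micro_batches(stream, 2)]
```

In real Spark Streaming the batch interval is time-based rather than count-based, but the structure is the same: one transformation applied uniformly to every batch in the sequence.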
StreamingPro
StreamingPro is not a complete application, but rather an extensible and programmable framework for Spark Streaming (it also covers plain Spark and, in the future, Storm)
that can easily be used to build your streaming applications.
StreamingPro also makes it possible to build a streaming program simply by assembling components (e.g. the SQL component) in a configuration file.
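As an illustration of what such a configuration might look like, the JSON below is a hypothetical sketch only: the component names and parameter keys are invented for this example and are not StreamingPro's actual schema. It wires a socket input through a SQL step to console output.

```json
{
  "my-first-job": {
    "desc": "word count over a socket stream (illustrative schema only)",
    "strategy": "streaming",
    "compositor": [
      { "name": "source.socket",  "params": [{ "host": "127.0.0.1", "port": "9999" }] },
      { "name": "sql",            "params": [{ "sql": "SELECT word, COUNT(*) AS cnt FROM input GROUP BY word" }] },
      { "name": "output.console", "params": [] }
    ]
  }
}
```

The point of the declarative style is that swapping the input, the transformation, or the output means editing this file, not recompiling code.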
Features
- Pure Spark Streaming (or normal Spark) programs (Storm in the future)
- No coding required, only declarative workflows
- REST API for interaction
- SQL-oriented workflow support
- Data continuously streamed in and processed in near real-time
- Dynamic CRUD of workflows at runtime via the REST API
- Flexible workflows (input, output, parsers, etc...)
- High performance
- Scalable
Documents
- Properties
- Build
- Run your first application
- Submit application
- Dynamic CRUD of workflows at runtime via the REST API
- Recovery
- Useful modules introduction
- Other runtime support
User comments
I put the dependency jars directly under spark/lib, but on startup I still get:
Caused by: java.lang.ClassNotFoundException: net.csdn.common.logging.Loggers
I have already built net.csdn.common.jar locally.
The package was built following the "online" method and tested locally.