Customizing

Provider

Custom providers are created by declaring your own annotation and marking it with @ExecutorProvider and @BindingAnnotation:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import com.google.inject.BindingAnnotation;
// plus the Nightfall imports for @ExecutorProvider and your provider/module classes

@BindingAnnotation
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@ExecutorProvider(
        provider = MyStreamingContextProvider.class,
        module = MyModuleProvider.class,
        additionalModules = {MyAdditionalModule.class})
public @interface MyCustomProvider {
}
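
MyAdditionalModule referenced above can be an ordinary Guice module that binds whatever extra beans your tasks need. The sketch below is only an illustration; the named constant it binds is invented for the example.

import com.google.inject.AbstractModule;
import com.google.inject.name.Names;

// Minimal sketch of an additional module; it only binds an illustrative
// named constant that tasks could inject with @Named("kafka.topic").
public class MyAdditionalModule extends AbstractModule {

    @Override
    protected void configure() {
        bindConstant().annotatedWith(Names.named("kafka.topic")).to("my-topic");
    }
}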

The @ExecutorProvider configurations are described below:

  • provider: a class that implements com.google.inject.Provider<T>. For streams it must create a JavaStreamingContext (see KafkaHighLevelStreamingContextProvider for an example); for batches it must create an RDD. A minimal sketch is shown after this list.
  • module: the module class used to create the task bindings, which lets you plug in your own task processor. For streams it is recommended to use StreamingModuleProvider; for batches you should use BatchTaskProvider.
  • additionalModules: additional modules used to bind custom beans.
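
The following sketch illustrates a streaming provider matching the MyStreamingContextProvider named in the annotation example above. It assumes Spark's JavaStreamingContext API; the application name and batch interval are illustrative values that a real provider would normally take from injected configuration.

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

import com.google.inject.Provider;

// Sketch of a streaming context provider; the Spark settings are illustrative.
public class MyStreamingContextProvider implements Provider<JavaStreamingContext> {

    @Override
    public JavaStreamingContext get() {
        SparkConf conf = new SparkConf().setAppName("my-nightfall-app");
        // The streaming context returned here is what the stream tasks consume.
        return new JavaStreamingContext(conf, Durations.seconds(10));
    }
}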

See How to use for more examples.

Nightfall Module

The module class is responsible for binding the executor to the data source. Take FileRDD as an example: it uses FileRDDProvider as its provider and DataPointBatchTaskProvider as its module.

DataPointBatchTaskProvider binds task execution to DataPointBatchTaskExecutor, so only classes annotated with @Task that implement the BatchTaskProcessor interface for the DataPoint<String> type are executed; this type is the one passed to the superclass AbstractNightfallModule.
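
Putting these pieces together, a batch-oriented provider annotation could be wired as in the sketch below. MyFileBatchProvider is a hypothetical name, the imports are the same as in the streaming annotation example, and additionalModules is set to an empty array here since the illustration needs no extra modules.

// Hypothetical batch provider annotation, reusing the classes named above.
@BindingAnnotation
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@ExecutorProvider(
        provider = FileRDDProvider.class,
        module = DataPointBatchTaskProvider.class,
        additionalModules = {})
public @interface MyFileBatchProvider {
}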

The contextProvider is also injected using a @Named binding, as a workaround for problems with binding generics. This is done by NightfallBootStrapModule:

    // Workaround to bind provider within ModuleProvider
    binder
        .bind(String.class)
        .annotatedWith(Names.named("contextProvider"))
        .toInstance(provider.getName());
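
As a rough illustration of how such a named binding can be consumed later on, the provider class name can be read back through @Named injection and instantiated via the injector. The resolver class below is hypothetical and is not the framework's actual code.

    import com.google.inject.Inject;
    import com.google.inject.Injector;
    import com.google.inject.Provider;
    import com.google.inject.name.Named;

    // Hypothetical consumer of the "contextProvider" binding shown above:
    // it resolves the provider class by name and lets the injector build it.
    public class ContextProviderResolver {

        private final Injector injector;
        private final String providerClassName;

        @Inject
        public ContextProviderResolver(Injector injector,
                                       @Named("contextProvider") String providerClassName) {
            this.injector = injector;
            this.providerClassName = providerClassName;
        }

        @SuppressWarnings("unchecked")
        public Provider<?> resolve() throws ClassNotFoundException {
            Class<? extends Provider<?>> providerClass =
                    (Class<? extends Provider<?>>) Class.forName(providerClassName);
            // The injector builds the provider, honoring its own @Inject points.
            return injector.getInstance(providerClass);
        }
    }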