ISIS NeXus Streamer for Mantid

This software is superseded by https://github.com/ess-dmsc/NeXus-Streamer

Streams event data from a NeXus file from RAL/ISIS using Apache Kafka, for the purpose of developing live data streaming in Mantid. Each message sent over Kafka comprises the event data from a single neutron pulse.

The client runs until the user terminates it, repeatedly sending data from the same file but with incrementing run numbers. However, the -z flag can be used to produce only a single run.

Usage:

main_nexusPublisher -f <filepath>    Full file path of nexus file to stream
-d <det_spec_map_filepath>    Full file path of file defining the det-spec mapping
-b <host>    Broker IP address or hostname
[-i <instrument_name>]    Used as prefix for topic names
[-s]    Slow mode, publishes data at approx realistic rate of 10 frames per second
[-q]    Quiet mode, makes publisher less chatty on stdout
[-z]    Produce only a single run (otherwise repeats until interrupted)

Usage example:

main_nexusPublisher -f /path/to/isis_nexus_streamer_for_mantid.git/data/SANS_test_uncompressed.hdf5 -d /path/to/isis_nexus_streamer_for_mantid.git/data/spectrum_gastubes_01.dat -b localhost -i SANS2D -z

Broker Configuration

Timestamped run start and stop messages are produced. With these, Mantid can join the stream at the start of a run and has various options for behaviour at the end of a run. This makes use of Kafka's offset-by-timestamp lookup feature and therefore requires Kafka version >0.10.2.0 on the brokers. It is also important to allow larger than the default message size by adding the following to the Kafka configuration file (server.properties):

replica.fetch.max.bytes=10000000
message.max.bytes=10000000

We use this Ansible playbook to deploy Kafka: https://github.com/ScreamingUdder/ansible-kafka-centos

Containers

The docker-compose script can be used to launch a single-broker Kafka cluster and the NeXus streamer. Run the following in the root directory of the repository to launch the containers.

docker-compose up

The streamer publishes some test data using the instrument name TEST. The Kafka broker is accessible at localhost:9092.
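To check that the containers are producing data, a Kafka console consumer can be attached to one of the streamer's topics. The topic name `TEST_events` below is an assumption derived from the `TEST` instrument-name prefix described above; confirm the actual topic names on your broker before relying on it:

```shell
# Requires a running broker (e.g. the docker-compose cluster above) and the
# Kafka CLI tools on the PATH. "TEST_events" is an assumed topic name based
# on the "TEST" instrument prefix; substitute the real topic name if it differs.
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic TEST_events --from-beginning --max-messages 5
```

Note that the messages are FlatBuffers-encoded (see the Schema section), so the console output will be binary rather than human-readable text.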

Dependencies

Dependencies can be installed using Conan, which can itself be installed using pip. The following remote repositories must be configured:

You can add them by running

conan remote add <local-name> <remote-url>

where <local-name> must be substituted by a locally unique name. Configured remotes can be listed with conan remote list.

Google Test and Google Mock are used for unit testing but are not required to be installed; CMake will download and build them at configure-time.

Build

The first line can be omitted if you have librdkafka v0.11.1 and libhdf5 v1.10 available on your system.

conan install <path-to-source>/conan --build -s compiler.libcxx=libstdc++11
cmake <path-to-source>
make

Unit tests

The unit test executable unitTests needs to be passed the path of the test data directory as an argument. Alternatively, run all unit tests using ctest with

ctest -VV

from the build directory.
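To run the test executable directly rather than through ctest, pass it the data directory; the path below is illustrative and assumes the checkout location used in the usage example above:

```shell
# Run from the build directory; replace the argument with the location of the
# repository's data directory on your machine.
./unitTests /path/to/isis_nexus_streamer_for_mantid.git/data
```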

Schema

The schema files are located in the https://github.com/ess-dmsc/streaming-data-types repository.

License

BSD-2-Clause (see LICENSE) and MIT (see LICENSE.DownloadProject).

Stars

Watchers

Forks

Packages

No packages published