
Complete examples

This section helps you set up a stream and start sending data to STRM Privacy.

Setting up a stream

This section assumes that you have created an account on the console.

Using the programming language examples

To run these examples, you need the following:

  • An input stream to send data to (see the documentation on creating streams if you have not created one yet)

  • The credentials for this stream (presented upon stream creation). Either note down the values returned by the strm create stream command, or use the --save flag to store them in the ~/.config/strmprivacy/Stream directory.
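The file written by --save is JSON. Judging from the fields read out in the Java example below, it contains at least the following structure (the values here are placeholders, and the real file may contain additional fields):

```json
{
  "ref": {
    "billingId": "your-billing-id"
  },
  "credentials": [
    {
      "clientId": "your-client-id",
      "clientSecret": "your-client-secret"
    }
  ]
}
```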

The following demo applications show how dummy data can be sent at a fixed frequency. The data is static and will not produce any useful patterns for analysis, but it does show how events can be constructed and delivered to STRM Privacy.
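Independent of language, the demos all follow the same shape: build a static dummy event, send it, wait, repeat. A minimal shell sketch of that pattern (send_one is a hypothetical stand-in for the real driver call, not part of the STRM CLI):

```shell
# Sketch of the demo pattern: construct static dummy data and send it at a
# fixed frequency. send_one is a hypothetical stand-in for the driver call.
send_one() {
  # The payload is static, so it won't yield interesting analysis results.
  printf 'sending event: {"sessionId":"session-%s"}\n' "$1"
}

run_demo() {
  interval="$1"  # seconds between events
  count="$2"     # number of events to send
  i=1
  while [ "$i" -le "$count" ]; do
    send_one "$i"
    sleep "$interval"
    i=$((i + 1))
  done
}
```

For example, run_demo 1 3 sends three events, one per second.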


Currently (Aug. 2021) every example language has its own configuration file format. This is inconvenient and will be fixed: we aim to standardize on the format created by strm create stream (stream-name) --save, so that getting up and running becomes easier.

java-driver / java-avro

This example is also available on GitHub; please see the repository for the readme.

Short steps to start sending data:

git clone
cd java-examples
strm create stream demo --save
f=$( strm context info Stream/demo )
billingId=$(jq -r '.ref.billingId' "$f")
clientId=$(jq -r '.credentials[0].clientId' "$f")
clientSecret=$(jq -r '.credentials[0].clientSecret' "$f")
mvn package
java -jar target/java-examples-0.0.1-SNAPSHOT-jar-with-dependencies.jar \
    "$billingId" "$clientId" "$clientSecret"

If everything is configured correctly, the application logs an HTTP 204 (No Content) for every event it delivers:

org.eclipse.jetty.util.log               - Logging initialized ...
io.strmprivacy.driver.client.AuthService - Initializing a new Auth Provider
io.strmprivacy.examples.Sender - 204
io.strmprivacy.examples.Sender - 204
io.strmprivacy.examples.Sender - 204
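Note that jq -r prints the literal string null when a key is missing, so a stale or malformed config file can slip through the extraction steps above unnoticed. A small guard can fail fast before you launch the jar (a sketch; check_credentials is a local helper, not part of the STRM CLI):

```shell
# check_credentials fails when any argument is empty or the literal string
# "null" (which is what jq -r prints for a missing key).
check_credentials() {
  for v in "$@"; do
    if [ -z "$v" ] || [ "$v" = "null" ]; then
      echo "missing credential; re-run 'strm create stream demo --save'" >&2
      return 1
    fi
  done
  return 0
}
```

Run check_credentials "$billingId" "$clientId" "$clientSecret" right before the java -jar command, and abort if it returns non-zero.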


Receiving data

See strm listen web-socket for a debugging view of the events.

See exporting to Kafka or batch exporters for consuming events in production.