Publishing
Overview
The Publishing page allows you to configure publishing of normalised metric and event data to a downstream Kafka instance.
Publishing from Gateway Hub has advantages over publishing directly from the Gateway. The Gateway outputs data as human-readable strings that are difficult to consume programmatically; for this reason, Gateway Hub normalises metric and event data to a standardised JSON format.
Only one downstream Kafka instance is supported by a Gateway Hub cluster.
To open this page, go to Administration > Publishing in the Web Console side panel.
Prerequisites
Geneos components
Your Gateway Hub must be connected to a Geneos Gateway 4.10.x or higher to use publishing.
Kafka
Gateway Hub can only publish to topics that already exist on your downstream Kafka instance. Therefore, you must create the following topics, where ${prefix} is the topic prefix configured in the Publishing section of the Web Console:
${prefix}metrics
${prefix}events
${prefix}entities
Note
The default prefix is itrs-.
Publishing from Gateway Hub has been tested with downstream Apache Kafka instances running version 2.0.0 and higher.
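As a minimal sketch of the naming convention, the snippet below derives the three required topic names from the configured prefix. The creation command in the comment assumes the kafka-python package and placeholder broker addresses, neither of which is part of Gateway Hub.

```python
# Build the topic names Gateway Hub publishes to, from the configured prefix.
PREFIX = "itrs-"  # default; replace with your configured Topic prefix


def topic_names(prefix: str = PREFIX) -> list:
    """Return the three topics that must exist before publishing can start."""
    return [prefix + suffix for suffix in ("metrics", "events", "entities")]


print(topic_names())  # ['itrs-metrics', 'itrs-events', 'itrs-entities']

# Creating the topics is cluster-specific; with the kafka-python package
# (an assumption, not supplied by Gateway Hub) it would look roughly like:
#   admin = KafkaAdminClient(bootstrap_servers="broker:9092")
#   admin.create_topics([NewTopic(t, num_partitions=3, replication_factor=2)
#                        for t in topic_names()])
```

Remember that the topics must exist before publishing is enabled; Gateway Hub does not create them for you.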
Kerberos
If you want to use Kerberos to connect to Kafka, you must obtain:
- The Kerberos principal used to connect to Kafka.
- The keytab file encoding the password for the Kerberos principal.
On every Gateway Hub node, you must also update the Kerberos configuration file /etc/krb5.conf to contain the correct configuration information for your Kerberos domain. In particular, the [realms] section must contain the correct Kerberos server information.
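For illustration, a /etc/krb5.conf fragment with a populated [realms] section might look like the following. The realm and host names here are placeholders; substitute the values for your own Kerberos domain.

```ini
# Illustrative krb5.conf fragment; EXAMPLE.COM and kdc.example.com are placeholders.
[libdefaults]
    default_realm = EXAMPLE.COM

[realms]
    EXAMPLE.COM = {
        kdc = kdc.example.com
        admin_server = kdc.example.com
    }
```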
If you are using SSL encryption together with Kerberos authentication, you must also obtain the CA certificate used to sign the Kafka broker’s public keys.
Status
If you configure Kafka publishing, an information bar is displayed at the top of this configuration screen showing the current publishing status. Important information, such as an inability to connect to the configured Kafka instance, is displayed here.
Note
Retrieving the status from a remote Kafka instance can take more than 30 seconds, depending on factors such as connection speed, the location of the Kafka instance, and its processing capability.
The possible statuses are as follows:
- Configuration error
- Waiting
- Success
Kafka Publisher Configuration
This section is where you enter the details of your downstream Kafka instance, and any additional settings.
Enable or disable publishing using the toggle to the right of Publishing.
Field | Description |
---|---|
Bootstrap servers | Host and port values that Gateway Hub uses to establish a connection to Kafka. Click the add button to add more rows. |
Topic prefix | Optional prefix. This allows you to avoid collisions with existing Kafka topic names. The default is itrs-. |
Producer configuration name | Name of an additional Kafka setting. You can use any setting (other than callbacks) defined in the Apache Kafka documentation. Add more rows as needed using the add button. |
Producer configuration value | Value associated with the setting defined in Producer configuration name. |
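As an illustration, the name/value pairs you enter correspond to standard Kafka producer settings. The fragment below shows a few commonly tuned ones written in properties form; the values are examples only, not recommendations from Gateway Hub.

```properties
# Example Kafka producer settings entered as name/value pairs (values are illustrative).
compression.type=lz4
linger.ms=100
batch.size=65536
acks=all
```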
Click Request Snapshot to request a snapshot of all metric data.
Click Request Schema to request all schemas.
Security protocol
This section is where you select and configure the security protocols. The options are:
- PLAINTEXT
- SASL_PLAINTEXT
- SSL
- SSL_CLIENT_AUTH
- SASL_SSL
If you select SASL_SSL, the following configuration options appear:
Field | Description |
---|---|
CA certificate | Click Upload File to select the certificate used to sign the Kafka broker's public keys. |
Kerberos principal | Principal name used for Kerberos. |
Kerberos keytab | The keytab file encoding the password for the Kerberos principal. |
Filtering
Filtering allows you to publish a smaller subset of your data. This can considerably reduce the storage and processing requirements of a downstream application.
You can use one or more filter predicates to filter the data by message type and entity query. A record is published if it meets any of the include conditions. However, a record is not published if it meets any of the exclude conditions, even if it also meets an include condition.
For example, given the configuration below, all metrics and events are published for entities where Application = Fidessa, except where Department = Fixed Income:
Include/Exclude | Message type | Entities |
---|---|---|
Include | Events | Application = Fidessa |
Include | Metrics | Application = Fidessa |
Exclude | All | Department = Fixed Income |
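The include/exclude semantics above can be sketched in a few lines. This is an illustration of the rule "published if any include matches and no exclude matches", not Gateway Hub's actual implementation; the record and predicate shapes are assumptions made for the example.

```python
# Sketch of the filter semantics: excludes always win over includes.

def matches(pred, record):
    """A predicate matches when its message type is 'All' or equals the record's
    type, and every attribute constraint equals the record's attribute value."""
    type_ok = pred["type"] in ("All", record["type"])
    attrs_ok = all(record["attributes"].get(k) == v
                   for k, v in pred["attributes"].items())
    return type_ok and attrs_ok


def published(record, includes, excludes):
    """A record is published if it matches any include and no exclude."""
    return (any(matches(p, record) for p in includes)
            and not any(matches(p, record) for p in excludes))


# Predicates from the example table above.
includes = [
    {"type": "Events",  "attributes": {"Application": "Fidessa"}},
    {"type": "Metrics", "attributes": {"Application": "Fidessa"}},
]
excludes = [{"type": "All", "attributes": {"Department": "Fixed Income"}}]

fidessa_metric = {"type": "Metrics",
                  "attributes": {"Application": "Fidessa", "Department": "Equities"}}
fixed_income = {"type": "Metrics",
                "attributes": {"Application": "Fidessa", "Department": "Fixed Income"}}

print(published(fidessa_metric, includes, excludes))  # True
print(published(fixed_income, includes, excludes))    # False: exclude wins
```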
You can specify filters on entities using the basic or advanced menus:
- The basic menu allows you to select entities by specifying an attribute and a value.
- The advanced menu allows you to input a filter manually.
To create a filter using entities:
- In the Filtering section, click New Filter. This opens the New filter window.
- In the Entities field, click add.
- Select the attribute and its corresponding value. Entities matching the query are used by the filter.
- Click Add Filter.