MDM Writing Samplers and Feed Adapters - User guide
API Concepts
The diagram below shows the major components within the Market Data Monitoring plug-in and how they interact with each other. The black arrows indicate the direction of data flow. The Feed Controller component is shared by all MDM samplers and manages the Netprobe's connections to market data feed adapters. The MDM APIs are accessed as Lua modules and are described in detail in the next section.
Lua modules
Data Subscription API
This module allows an MDM script to connect to market data sources and subscribe for instrument data. The market data is provided as collections of normalised and timestamped tick data. Subscriptions are registered with the Feed Controller, which communicates with vendor-specific adapter libraries using the Feed Adapter API. A number of popular market data vendor types are supported out of the box; connectivity to proprietary or unsupported data sources can be enabled by creating custom adapter libraries.
Data Analysis API
The market data module provides full programmatic access to the subscribed data and all the fields including the timestamp. This can be used in combination with the analytic capabilities of the Lua programming language to provide a rich and highly customisable analysis capability.
Additionally, MDM provides built-in support for relative latency calculations via the Latency module:
This module provides a Lua implementation of a tick matching algorithm based upon the Longest Common Subsequence algorithm. This algorithm calculates the relative latency between two feeds by matching equivalent ticks within the data and calculating the difference in arrival time of messages (in other words, determining on which feed the data arrived first, and how long the lag was on the other feed). Statistics on the number of ticks, matches and computed latency are returned, which can then be published into Geneos.
The latency module can be customised in several ways. In particular, comparison of ticks (to determine whether they match) can be customised to prioritise, ignore or even recalculate selected tick fields. Latency metrics can also be adapted or extended to compute additional statistics from the algorithm output.
This sub-module provides support for logging the input and output of the latency algorithm.
Publishing API
This module provides the mechanisms that allow an MDM script to act as a Geneos sampler, publishing content to the Gateway. Dataviews can be created and updated directly from Lua code and commands can be published to create points of user interaction and workflows within the monitoring.
This sub-module provides a fluent interface for constructing commands, allowing complex definitions to be created using legible and maintainable code.
Sampler life cycle
Initialisation
When a sampler is started using the MDM plug-in, a new Lua environment (a Lua 'state') is created, and the specified Lua script is loaded into this environment and executed. At this point the script's task is to initialise the sampler; it will normally:
- Connect to one or more market data sources.
- Create one or more dataviews.
- Optionally define and publish commands, associating them with targets such as dataviews or cells.
- Define a doSample function, to process the data gathered by the data sources each time the sampler is due to take a sample.
- Optionally define a finalise function, to be called when the sampler is stopped.
- Exit, allowing the Netprobe to continue configuring and running plug-ins.
(In fact, only defining doSample and exiting are required, but a useful sampler will almost certainly do the first two as well.)
Running
From then on, MDM will invoke the doSample function defined by the script every sample (which occurs regularly at the configured sampling interval, or when a user invokes the "Sample Now" command). This function should typically:
- Collect data from the data sources.
- Make any required updates to the dataviews.
Meanwhile, if a user invokes one of the commands published by the sampler, MDM calls the function provided by the script when it published the command. This function does whatever is required to execute the user command. For instance it might: add or remove market data subscriptions, reset one or more metrics, add or remove rows in a dataview or create an on-demand report.
Finalisation
The Lua environment created when the sampler is started persists until the sampler is stopped, either by being removed from the configuration or by the Netprobe itself being stopped.
If the sampler script has defined a finalise function, this is called just before the Lua environment is destroyed. This function might close off a report or save closing values to a file, to be reloaded when the sampler is next started.
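As a minimal sketch of this pattern (assuming finalise is assigned on the geneos.sampler module in the same way as doSample; the state file name and value are illustrative, not part of the API):

```lua
local gs = require 'geneos.sampler'

-- Hypothetical names for illustration only
local stateFile = "sampler-state.txt"
local lastValue = 0

gs.finalise = function()
    -- Save a closing value, to be reloaded when the sampler next starts
    local f = io.open(stateFile, "w")
    if f then
        f:write(tostring(lastValue))
        f:close()
    end
end
```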
Errors
In Lua, errors can be raised deliberately, by calling the Lua error function, or unintentionally, for example by dereferencing a nil variable. Lua also provides a mechanism (the pcall function) to allow errors to be caught and handled. If a Lua error is raised and not handled by an MDM sampler script, the MDM plug-in will catch the error and write a message to the Netprobe log. Such errors are handled differently depending upon the context of the error.
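For example, a script can guard a risky part of its own sample logic with pcall and log the failure itself, rather than letting the error propagate up to MDM (computeMetrics here is a hypothetical helper, not part of the API):

```lua
local gs = require 'geneos.sampler'

gs.doSample = function()
    -- 'computeMetrics' is a hypothetical function that may raise an error
    local ok, err = pcall(computeMetrics)
    if not ok then
        -- Handle the error ourselves, so it does not count towards
        -- the consecutive failures that halt sampling
        gs.logMessage('WARN', "Sample failed: ", err)
    end
end
```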
- If the script errors during sampler initialisation, or if the script exits without defining a doSample function, no sampling will be performed. Instead, an 'ERROR' dataview will be created with a samplingStatus headline value describing the error.
- When the doSample function is called, the first four consecutive failures will simply be logged, but after the fifth failure sampling will be halted. In this case, sampling can be resumed by running the "Sample Now" command on that sampler.
- When a command is executed, if MDM catches an error, it is reported via the user interface as command output. (In Active Console, a tab displaying the error message is added to the Output window.)
- Errors that cause the finalise function to exit are logged but otherwise ignored.
Create a Lua script
A skeleton sampler
To run without reporting an error, an MDM script must at least declare a doSample function. Therefore, with the addition of some place-holder comments, the minimal script looks like this:
-- Load the API modules used by the script
local gs = require 'geneos.sampler'
-- Process configuration parameters
-- ================================
-- TODO
-- Configure and connect to market data sources
-- ============================================
-- TODO
-- Define functions to analyse the data
-- ====================================
-- TODO
-- Create data view(s)
-- ===================
-- TODO
-- Define the doSample function which will be called every sample
-- ==============================================================
gs.doSample = function()
-- TODO compute metrics then update and publish data views
end
-- End of script
This script will run without errors, whether configured via a Gateway or run in standalone test mode, but it won't display a dataview.
Publish a Dataview
The usual way to get output from a Geneos sampler is for it to create and update a dataview. An MDM sampler script can produce zero or more views each containing a number of headlines, rows and columns. Dataviews may be created at any time, but it is recommended that the view(s) be created during script initialisation, and subsequently updated and published each sample.
We will now add code to the skeleton sampler script above to create and publish a dataview. The changes made to this script are discussed below. Since connecting to market data has not yet been covered, we cannot put any market data into the view yet. Instead, the script will simply publish some static table data, with a headline samples that counts the number of times doSample is called (and so updates each sample).
-- Create data view(s)
-- ===================
-- Define columns and create view
local cols = { "instrument", "minSpread", "maxSpread", "ticksPerSample", "maxInterval", "tradePrice" }
local view = assert(gs.createView("spreads", cols))
-- Publish initial view data
view.headline.samples = 0
view.row['sample'] = { "our", "data", "will", "appear", "here"}
view.row['dummy'] = { ['ticksPerSample'] = 0 }
assert(view:publish())
Views are created by the createView function, which expects a view name and a list of column names. If column names are valid Lua identifiers, code that updates views can use a more concise and readable syntax. In any case, it is strongly recommended that column names use only alphanumeric characters, underscore, dash and space [A-Za-z0-9_-]. View names must be unique within each sampler, and column names must be non-empty and unique within the view. Notice the use of the Lua assert function to check for any error returned by createView.
The view object returned by createView is a Lua table with member functions to allow it to be published and refreshed. It also has members called headline and row, which are themselves tables representing the content of the view. A newly created view has an empty row table and a single headline variable named samplingStatus with a value of OK.
A headline can be created or updated simply by assigning a value to it; it can be deleted by assigning nil. In the script above, we add the samples headline just after creating the view and we update it in the doSample function. Notice that because samples is a valid identifier, we can use the short syntax view.headline.samples. If we had instead called the headline #samples, we would have had to write view.headline['#samples'].
Rows are added, updated and removed in a very similar way. However, a row has a value for each column, so the value associated with a row is also a table. When you assign a table value to a row, you can use either an array (as for row['sample'] above) or a mapping (as for row['dummy']). If using an array, the order of values in the array should match the order of the column names specified when the view was created (minus the first column, which represents row names). If using a mapping, the mapping keys should be strings matching the column names. Notice that, because ticksPerSample is a valid identifier, the mapping could have been given using the short syntax { ticksPerSample = 0 }.
-- Define the doSample function which will be called every sample
-- ==============================================================
local numSamples = 0
gs.doSample = function()
numSamples = numSamples + 1
view.headline.samples = numSamples
assert(view:publish())
end
The code fragment above will increment a count variable
every sample. This count is displayed in the output
dataview as the value of the samples
headline, which is updated simply by overwriting the entry
with a new value. Updating table cells can be done either
by replacing the entire row with a new table, or updating
specific cell values within the table as required.
View content changes will not immediately be seen by the
Gateway; you must call the publish member function to publish
your changes. It is recommended that the publish
function only be called once per view per sample.
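For instance, both update styles described above produce the same view content (the row name and value here are illustrative):

```lua
-- Replace the entire row with a new table...
view.row['dummy'] = { ticksPerSample = 3 }
-- ...or update a single cell in place, leaving other cells untouched
view.row['dummy'].ticksPerSample = 3
-- Either way, nothing reaches the Gateway until publish is called
assert(view:publish())
```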
Finally, to make it easier to check the progress of our sampler, we will also add a couple of logMessages at the start and end of our script:
-- Load the API modules used by the script
local gs = require 'geneos.sampler'
gs.logMessage('INFO', "Starting")
gs.logMessage('INFO', "Started OK")
-- End of script
The script now displays and updates (part of) a dataview:
Subscribe to Market Data
To fulfil the primary purpose of a Market Data Monitor plug-in, we need to subscribe to some data. For this tutorial we will use the Example Adapter to provide (random) data, since it is very simple to configure and requires no external data source.
The following code examples have been applied to the publishing dataviews example, to form a new updated script.
Subscribing to market data involves:
- Identifying the feed adapter to be used.
- Providing configuration information specific to that adapter.
- Identifying the instruments to be subscribed to.
- Identifying the fields to be subscribed to for each instrument.
All this information is collected together into a Lua table of 'type' FeedConfig, and is a required parameter when creating a feed. The first part of the configuration tells the Feed Controller which adapter (library) is to be used:
-- Configure and connect to market data sources
-- ============================================
-- Feed configuration is stored in a Lua table
local feedConfiguration = {}
-- Feed Controller configuration
feedConfiguration.feed = {
type = "example",
["library.filename"] = "flm-feed-example.so",
}
The next part sets parameters specific to the feed adapter. The FeedConfig field name should match the type setting in the previous part, in this case example. The Example adapter has only one parameter: the number of milliseconds it pauses between publishing tick updates.
-- Feed adapter configuration
feedConfiguration.example = {
publishingPeriod = 200,
}
The next two parts specify the instruments and fields for the initial set of subscriptions. If no initial subscriptions are required, these fields may be omitted. Each of them maps a set of names used for display by the sampler to a set of codes understood by the feed. This allows verbose codes used by the feed to be mapped to concise names that are understood by the end users; it also allows inconsistent codes used by different feeds to be mapped to a common set of names.
In general, you will need to refer to the documentation for the feed adapter and that of market data vendor to determine the correct instrument and field codes. In the case of the Example feed, the instrument codes used are unimportant; any string value given creates a new instrument subscription. Field codes are again mostly unimportant; there are a few special field codes, but any other code will simply generate values where each is 0.01 greater than the previous value. To make the data slightly more realistic, we order the field codes so that Bid is generated before Trade and Ask after.
-- Define the set of instruments
-- Instrument subscriptions are defined as a mapping from display name to stock code
feedConfiguration.instruments = {
GOOG = "DATA_SERVICE.GOOG",
IBM = "DATA_SERVICE.IBM",
}
-- Define the set of fields
-- Fields are defined as a mapping from display name to field code
feedConfiguration.fields = {
Bid = "01_BID",
Trade = "02_TRADE_PRICE",
Ask = "03_ASK",
}
Having set up the configuration we can now create and start the feed. This is done using the addFeed() function in the geneos.marketdata module, so we must add a reference to this module in our script.
-- Load the API modules used by the script
local gs = require 'geneos.sampler'
local md = require 'geneos.marketdata'
-- Create and start a feed using the above config
local tutorialFeed = assert(md.addFeed("Tutorial-Feed", feedConfiguration))
tutorialFeed:start()
The addFeed() function returns a Feed object. This is a Lua userdata object with methods to allow the feed to be started and stopped, subscriptions to instruments to be added or removed, and data to be collected. The name passed when the feed is defined (Tutorial-Feed in this tutorial) is used to identify the feed in log messages issued by the market data API and by the Feed Controller. It should be unique among feeds created by the sampler. Note again the use of assert to check for any errors when adding the feed.
We can now omit the rows sample and dummy, which were added for the "Publish a Dataview" part of this tutorial, when publishing the initial view data. Instead we create or update a dataview row per instrument during each doSample call, populating the rows with tick data from our subscriptions.
-- Define the doSample function which will be called every sample
-- ==============================================================
local numSamples = 0
gs.doSample = function()
numSamples = numSamples + 1
view.headline.samples = numSamples
local firstTick, lastTick = tutorialFeed:getTicks('GOOG')
local lastTrade = lastTick and lastTick.field['Trade']
view.row['GOOG'] = { tradePrice = lastTrade }
firstTick, lastTick = tutorialFeed:getTicks('IBM')
lastTrade = lastTick and lastTick.field['Trade']
view.row['IBM'] = { tradePrice = lastTrade }
assert(view:publish())
end
Once a feed has been started, a stream of ticks starts to build up for each instrument subscription. These ticks are fetched by calling the getTicks() method of the feed object. This returns two values: the first and the last tick received since the last time the subscription was queried. If there were no ticks, both these return values will be nil; this will generally be the case on the first sample.
The ticks are provided as a linked list, meaning all the ticks can be accessed by following the links from the first. We will show this in more detail in the next section of this tutorial. For now we will just display the 'Trade' field from the last (i.e. latest) tick of each instrument. Note that each tick is a Lua table which contains a table called field, which in turn contains the field values. Note also the use of the Lua and operator to guard against attempting to access lastTick.field if the variable is nil.
The script now populates one of the columns identified when the dataview is defined:
Analyse the Tick Data
Although it is possible to copy all the values of each tick into a dataview row and use Gateway rules to compute statistics, this would be a poor design choice. The work the Netprobe would do to copy the data would be more than the work needed to analyse it. Similarly, the Gateway would need to process large volumes of row updates before it could begin to run rules on the data. Instead, an MDM sampler should compute the required metrics from the tick data and publish only these metrics to the Gateway.
Because the Example Adapter is a simplistic simulator which does not attempt to reproduce live market data, there are limits to the analysis we can usefully do in this tutorial. This section demonstrates the principles involved, which can be extended as required to perform more complex analysis on real input data.
The metrics we will calculate for each instrument are:
- Minimum and maximum spread per sample (calculated as difference between bid and ask).
- Number of ticks received per sample.
- Maximum time interval between ticks.
- Last trade price.
We will also replace our count of samples with a count of the total number of ticks received.
To analyse each tick for an instrument, we will need to iterate over all the ticks received. Ticks are provided as a singly-linked list with a next field, so to iterate we just need to follow these links until the next tick is nil. As an alternative, we could instead call geneos.marketdata.ticksToArray() and iterate over the ticks in an array.
local tick = tutorialFeed:getTicks('IBM')
while tick do
    -- TODO use tick data in calculations
    tick = tick.next
end
For "per-sample" metrics this loop is sufficient, and we can do all of our calculations in the "TODO" section of the loop above. However, to calculate our example metric "maximum time interval between ticks" we need to store some data between samples. In this case we store the last tick received in each sample, so that we can compare its arrival time with the time on the first new tick, thus computing the time interval.
The loop to iterate over the ticks from the current sample then becomes the following:
local prevTick = nil
function processTicks()
    -- 'tick' is either the last tick from the previous sample, or an empty sentinel value
    local tick = prevTick or { field = {} }
    tick.next = tutorialFeed:getTicks('IBM')
    -- TODO Extract required values from 'tick'
    -- Iterate over the ticks
    while tick.next do
        tick = tick.next
        -- TODO Use tick data to calculate metrics
    end
    -- Store last tick of this sample for next time
    prevTick = tick
end
Continuing to update the tutorial script developed in the previous sections, we now apply the changes described below.
Define a helper function for rounding computed values to two decimal places for display purposes:
-- Define functions to analyse the data
-- ====================================
-- Utility function to round non-nil (positive) values to two decimal places
local function round2dp(value)
return value and math.floor(value * 100 + 0.5) / 100
end
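Since this helper is plain Lua, its behaviour can be checked outside MDM in a standalone interpreter; note that the 'value and ...' guard passes nil straight through:

```lua
-- Standalone check of the rounding helper defined above
local function round2dp(value)
    return value and math.floor(value * 100 + 0.5) / 100
end

print(round2dp(3.14159)) -- 3.14
print(round2dp(2.718))   -- 2.72
print(round2dp(nil))     -- nil: the guard short-circuits before any arithmetic
```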
Next, define a function to perform the analysis of the ticks received in the current sample, for a single instrument.
-- A table to map instrument names to the last tick seen for each instrument
local prevTick = {}
-- A function to process the sample data for a specific instrument.
-- Takes a single string argument; the instrument name.
local function processTicks(instName)
local tick = prevTick[instName] or { field = {} }
tick.next = tutorialFeed:getTicks(instName)
-- Extract required values from 'tick'
local prevTime = tick.timeLast
-- Metrics we will compute from the tick data
local tickCount, maxInterval, minSpread, maxSpread
tickCount = 0
-- Iterate over the ticks
while tick.next do
tick = tick.next
-- Use tick data to calculate metrics
tickCount = tickCount + 1
local interval = prevTime and tick.timeFirst - prevTime
if not maxInterval or interval > maxInterval then maxInterval = interval end
prevTime = tick.timeLast
local spread = tick.field.Ask - tick.field.Bid
if not minSpread or spread < minSpread then minSpread = spread end
if not maxSpread or spread > maxSpread then maxSpread = spread end
end
prevTick[instName] = tick
-- Create a table of row values.
-- We will place this result directly into a view, so the field names must match the column names.
local rowValues = {
ticksPerSample = tickCount,
maxInterval = round2dp(maxInterval),
minSpread = round2dp(minSpread),
maxSpread = round2dp(maxSpread),
tradePrice = tick.field.Trade -- Trade price is taken from the last tick seen
}
-- Return two values; (1) a table of row values and (2) the count of new ticks
return rowValues, tickCount
end
This function follows the tick iteration loop which keeps the last tick from the previous sample. The example is extended slightly to take an argument specifying which instrument to process. This means we must also change the tick storage to store the last tick on a per-instrument basis, and so prevTick becomes a map instead of a single value.
The calculations in the body of the while loop generate the metrics which will be published to the view for that instrument. After iterating over all ticks, the row for the view is generated and returned to the caller along with the number of ticks processed.
The functions defined so far take care of computing metrics at the level of each instrument subscription. We also want to show how summary statistics can be calculated and displayed, so we replace the samples headline with one showing totalTicks:
-- Publish initial view data
view.headline.totalTicks = 0
assert(view:publish())
Finally, we modify the doSample function to call the analysis function processTicks for each subscribed instrument. The first result returned by processTicks is applied to the dataview row; the second is used to compute the value for the totalTicks headline.
-- Define the doSample function which will be called every sample
-- ==============================================================
local totalTicks = 0
gs.doSample = function()
    -- Loop over each subscribed instrument
    for name, _ in pairs(feedConfiguration.instruments) do
        -- processTicks() returns 2 values
        local rowResult, tickCount = processTicks(name)
        -- Update the view row for this instrument with the new metrics
        view.row[name] = rowResult
        -- Update the tally of total ticks observed
        totalTicks = totalTicks + tickCount
    end
view.headline.totalTicks = totalTicks
assert(view:publish())
end
The script now populates all the columns in the dataview:
Settings and Properties
The MDM sampler API allows custom parameters to be configured as settings in the Gateway Setup Editor, which will then be passed down to the script when the sampler is executed. These parameters allow users to vary the behaviour of the script without changing the content of the script itself.
In addition to parameters, the script is also passed several fields which identify the sampler being executed (i.e. the sampler name, managed entity name and entity type). These properties are useful for logging, or for generating filenames or other identifiers which indicate the source of the data output.
In this section of the tutorial, we will read the sampler name property and define settings for the set of instrument names and for the publishingPeriod parameter passed to the Example feed.
First, we use the name properties to build a name string which we log as an info message:
-- Report the sampler managed entity and name
local fullName = gs.entityName .. '.' .. gs.name
gs.logMessage('INFO', "Sampler is ", fullName)
The parameter settings defined in the Gateway Setup file (or on the Netprobe command line if using test mode) are passed to the script in the parameters table. Parameters are defined as key-value pairs of type string. It is generally good practice to write any configuration parameters out to the Netprobe log, to help future maintenance and debugging:
-- Log all configuration parameters (if any)
for key,value in pairs(gs.params) do
gs.logMessage('INFO', 'Parameter "', key, '"="', value, '"')
end
Next, check for two specific parameters:
-- Get the 'publishingPeriod' parameter passed to the example feed
-- The parameter is passed as a string, which we must convert to a number.
-- If conversion fails or the parameter is not specified, this will default to 200.
local periodParam = tonumber(gs.params['publishingPeriod']) or 200
-- Get instrument names or default to "GOOG,IBM"
local instrumentsParam = gs.params['instruments'] or "GOOG,IBM"
local instrumentMap = {}
for inst in string.gmatch(instrumentsParam, "[^,]+") do
instrumentMap[inst] = string.format("DATA_SERVICE.%s", inst)
end
The publishingPeriod parameter is passed as a string, and must be interpreted as a number using the Lua tonumber function. The or operator is used to provide a default value for this parameter in case it is either not specified or not a number.
The instruments parameter takes a comma-separated list of instrument names. This list is split on each comma by the string.gmatch function. A mapping is then built up from the instrument name to a stock code consisting of the name prefixed by the string "DATA_SERVICE.".
Now that the parameters have been parsed, we can substitute these variables in place of the hard-coded values we previously used:
-- Feed adapter configuration
feedConfiguration.example = {
publishingPeriod = periodParam,
}
-- Define the set of instruments
feedConfiguration.instruments = instrumentMap
We can make the script more portable by using the file separator to tell whether we are on Windows or Unix:
-- Detect the C library extension based on the platform (detected using the file separator character)
local clibExt = (package.config:sub(1,1) == "/") and ".so" or ".dll"
-- Feed Controller configuration
feedConfiguration.feed = {
type = "example",
["library.filename"] = "flm-feed-example" .. clibExt,
}
Finally, we can make the feed configuration more compact by reducing everything down to a single definition:
-- Detect the C library extension based on the platform
local clibExt = (package.config:sub(1,1) == "/") and ".so" or ".dll"
-- Feed configuration is stored in a Lua table
local feedConfiguration = {
feed = {
type = "example",
["library.filename"] = "flm-feed-example" .. clibExt,
},
example = {
publishingPeriod = periodParam,
},
instruments = instrumentMap,
fields = {
Bid = "01_BID",
Trade = "02_TRADE_PRICE",
Ask = "03_ASK",
},
}
The script now accepts parameters entered via the GSE:
Running the sampler, we can see there are fewer ticks received per sample and the configured instrument names are different:
You can also run this script from the Netprobe command line with parameters passed as additional arguments:
: ./netprobe.linux -mdmtest code_examples/tutorial-step5.lua publishingPeriod=600 instruments=APPL,GOOG,ORCL
The parameters are logged to the output:
<Tue Nov 26 09:32:46> INFO: mdmTest Reading key file 'mdm_appkey.txt', contents listed below.
<Tue Nov 26 09:32:46> INFO: mdmTest Client : ITRS Group
<Tue Nov 26 09:32:46> INFO: mdmTest Contact : Doc-Generator
<Tue Nov 26 09:32:46> INFO: mdmTest Test mode : Enabled
<Tue Nov 26 09:32:46> INFO: mdmTest Gateway mode: Enabled
<Tue Nov 26 09:32:46> INFO: mdmTest Starting
<Tue Nov 26 09:32:46> INFO: mdmTest Parameter "publishingPeriod"="600"
<Tue Nov 26 09:32:46> INFO: mdmTest Parameter "instruments"="APPL,GOOG,ORCL"
<Tue Nov 26 09:32:46> INFO: FeedController Create feed request for 'Tutorial-Feed' from plugin 'mdmTest'.
...
Commands
As a final step, we will add a user command to our sampler, to reset the 'totalTicks' headline. To do this we need to use the geneos.sampler.commands module in our script:
-- Load the API modules used by the script
local gs = require 'geneos.sampler'
local md = require 'geneos.marketdata'
local cmd = require 'geneos.sampler.commands'
Commands are described by a definition object, which specifies properties such as where the command is available and what arguments (if any) it requires. The newDefinition function creates empty definition objects. The definition is then populated by calling the object methods. In this case, we specify that the command applies to the named headline variable using addHeadlineTarget.
-- Define a command
-- The target specifies the command will appear on the 'totalTicks' headline variable
local cmdDef = cmd.newDefinition()
:addHeadlineTarget(view, "totalTicks")
When a command is executed by a user, the sampler API will run the callback function associated with the command. We define the callback function for our example command below. In this case, the function will reset the value of the totalTicks variable, and publish a dataview update with the new headline value.
-- Define a function which implements the command
local resetTotalTicks = function(target,args)
totalTicks = 0
view.headline.totalTicks = totalTicks
assert(view:publish())
end
The publishCommand function uses these two components to publish a new command.
-- Publish the command
assert(gs.publishCommand(
    "Reset Tick Count", -- Name of the command
    resetTotalTicks,    -- Function to execute
    cmdDef              -- The command definition
))
Note that the name supplied here will define how the command appears to the user. A longer, unique name is used by the Gateway for the command and appears in the Netprobe log when the command is executed. This name is used when permissioning or scheduling the command in the Gateway configuration.
Our command is now available in the headline's context menu:
We have now taken the script as far as is reasonable using the Example adapter as input. It could usefully be reconfigured to connect to a live data source, but this is outside the scope of this tutorial.
MDM How-To
The sections below describe several different code examples for populating and publishing dataviews.
Create Views
New dataviews are created using the createView function. Views may be created at any time, but it is recommended that views be created during script initialisation where possible. Multiple (independent) views can be created, but each one requires a unique name within the MDM sampler.
When modifying a view object, any changes made will not be published until the view:publish() function is called.
-- Create a view
local cols = {"rowName", "mul10", "mod10"}
local view = assert(gs.createView("MyView", cols))
-- Another view sharing the same columns
local sameColsView = assert(gs.createView("SameColumns", cols))
-- Another view with (different) columns declared inline
local subrowsView = assert(gs.createView("subrows", {"row","col1", "col2", "col3"}))
The column names array may be shared between views, or declared inline. Remember that the first entry in this array is the row name, and so will not be used when adding row values.
Create, update and remove headline variables
Headline variables are represented as entries in the view's headline table, mapping from headline name to value. Each dataview is created with a samplingStatus headline already present, with an initial value of OK. This headline cannot be removed from the view; attempting to do so will only clear the headline value.
-- Creating, updating and removing headlines
view.headline["status"] = "connected"  -- Just assign to a headline to create it
view.headline["total"] = 0
view.headline["total"] = 13   -- To update a headline, just assign a new value
view.headline["total"] = nil  -- Assigning the value nil will remove the headline
view.headline = {}            -- Remove all headlines by assigning the empty table
To clear a headline without removing it, assign the empty string "".
Short-hand syntax (Lua syntactic sugar)
The Lua language comes with a short-hand syntax (or syntactic sugar) for common Lua table operations.
-- Lua syntactic sugar for table operations
-- If the table key strings are valid Lua identifiers, they can be used directly (not as strings).
view.headline.total = 17.4
-- This also applies to table initialisation
-- Long form:
view.headline = { ["total"]=22.8, ["status"]="disconnected" }
-- Short form:
view.headline = { total=22.8, status="disconnected" }
This syntax can only be used for names that are valid Lua identifiers, so names with spaces or other characters (e.g. "percent CPU" or "ticks/sec") will need to use the long syntax.
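For example, such names can only be written with the bracket syntax. The sketch below uses a plain Lua table so it is self-contained; the view's headline table behaves the same way:

```lua
-- Names that are not valid Lua identifiers require the bracket syntax
local headline = {}
headline["ticks/sec"] = 250     -- contains '/', cannot use dot syntax
headline["percent CPU"] = 12.5  -- contains a space, cannot use dot syntax
-- Writing headline.ticks/sec would be a syntax error; brackets are required.
```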
Names which are held in a variable will also need to use the long syntax, using the variable itself (rather than a string literal) as the key.
local name = "status"
view.headline[name] = "new status string"
-- equivalent to
view.headline["status"] = "new status string"
-- not
view.headline.name = "string"
-- creates a headline called 'name'
Create, update and remove rows
Working with dataview rows is very similar to working
for headlines. Rows are stored in the view's row
table, mapping from the row
name to a table of values.
The value of a row is a table of column values for that row. This table can be specified either as an array (where index 1 is the value for the first 'value' column) or as a mapping (where the mapping names must match the column names). Note that the first column name specified when creating a view is the row name, and is not used for values.
-- Creating, updating and removing table rows
-- Working with rows works as for headlines, except that rows must be a table of values.
-- Rows can be either an array:
-- (with values defined in the same order as column names)
view.row.vals = { "value 1", "value 2" }
-- Or a mapping:
-- (from column name to cell value)
view.row.static = { mul10="count * 10", mod10="c % 10" }
view.row.vals = nil
-- remove a row
view.row.static.mod10 = "count % 10"
-- replace a specific column value
To remove all rows at once, simply assign the row table to an empty table (as per headlines above). You may want to do this if you are regenerating a dataview each sample from another data source, and your set of rows may change each sample.
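The regenerate-each-sample pattern can be sketched as follows. Here fetchStats is a hypothetical data source, and view is stubbed as a plain table so the sketch is self-contained:

```lua
-- Sketch: rebuild the entire row set each sample from another data source.
-- 'fetchStats' is a hypothetical function returning a map of row name to values;
-- 'view' stands in for the dataview object used in the examples above.
local view = { row = {} }
local function fetchStats()
    return { alpha = { 1, 2 }, beta = { 3, 4 } }
end
local rows = {}
for name, values in pairs(fetchStats()) do
    rows[name] = values
end
view.row = rows   -- a single assignment replaces every existing row
```

Rows that disappeared from the source simply do not appear in the new table, so no explicit removal step is needed.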
Sub-rows or indented rows
Geneos UIs will by convention display rows containing a # character indented from the other rows in a dataview. Such rows are sometimes referred to as "sub-rows", and are typically used for displaying additional details for the row they are displayed under. Examples of usage by other Geneos plug-ins include the "summary+details" view of the PROCESSES plug-in, or the physical/logical rows of the CPU plug-in.
To use sub-rows in your own plug-in, simply name the row in the form topRow#detail. Metrics for the sub-row should typically display detailed information, while the top (or summary) row displays a summary or aggregate of the sub-rows. For example, a "total" or "average" column would display the total or average of the figures displayed beneath it.
-- Sub-rows are displayed 'indented' in Geneos UI
-- This convention is typically used to display "summary" and "detail" data in the same view.
-- The top-level row shows the summary or aggregate data.
subrowsView.row["top"] = { 8, 7, 4 }
subrowsView.row["top#sub1"] = { 2, 7, 3 }
subrowsView.row["top#sub2"] = { 6, 0, 1 }
subrowsView.row["head"] = { 1, 0, 0 }
subrowsView.row["head#sub1"] = { 1, 0, 0 }
assert(subrowsView:publish())
The code above produces the following dataview:
Publishing Commands
The sections below describe several different usage examples for the commands API, with code examples. These examples can be used as a starting point for developing more specific interactions.
A script containing the examples below is available here
.
These examples assume that the geneos.sampler.commands package has been imported:
local gs = require "geneos.sampler"
local cmd = require "geneos.sampler.commands"
Publish a command for a headline
This example shows how to publish a command that applies to a specific headline, using the addHeadlineTarget function. The command definition is created separately from the publish function call, but could easily be declared inline like the command callback function.
-- A command definition that applies to the 'count' headline of view 'v'.
local h1 = cmd.newDefinition():addHeadlineTarget(v, "count")
-- Add the command 'Reset Count' using this command definition.
assert(gs.publishCommand(
    "Reset Count",            -- The "short" command name, appears in the UI menu.
    function(target, args)    -- The function to be executed is defined inline.
        count = 0
        v.headline.count = count  -- Reset the headline count back to 0.
        assert(v:publish())
    end,
    h1                        -- Pass the command target definition.
))
The callback function interacts with the published dataview, resetting the sample counter headline to 0. The view is then published after the modifications are complete, which makes the command feel more responsive. If the call to publish were omitted, the view would typically not update with the new value until the next sample.
Since the command applies to only a single target, there is no need to check the target parameter. Similarly, as this command takes no arguments, the args parameter can be ignored.
When this command is successfully published, it appears in the user interface as follows:
Publish a command for a row (or column)
Commands can be defined for a dataview row or column using the addRowTarget or addColumnTarget functions respectively. These targets will allow the command to be displayed for any dataview cell in the specified row or column. The functions differ slightly in that the target rows can be specified by a wildcard pattern, whereas a column name must be configured exactly.
The example below defines a command which applies to rows matching the pattern *#* (i.e. rows whose name contains a # character). This pattern therefore matches all sub-rows in a Geneos dataview. Executing this command will remove the row corresponding to the cell that the user clicked on, by referencing the target.row parameter passed to the callback function.
-- Add a command on all rows matching the given pattern
assert(gs.publishCommand(
    "Remove Row",
    function(target, args)
        v.row[target.row] = nil -- Remove the row the user clicked on
        assert(v:publish())
    end,
    -- Command applies to all 'subrows' (i.e. rows whose name contains a # character)
    cmd.newDefinition():addRowTarget(v, "*#*")
))
Nesting commands in sub-menus
By default commands are displayed at the top level of the context menu, with the (short) command name forming the menu text. The menu path property of a command definition can override this location, allowing commands to be nested inside sub-menus. Commands sharing the same menu path prefix will be displayed as siblings in the sub-menu.
-- A command with a custom menu path
-- This only affects how the command appears in the UI, the command name remains unchanged
assert(gs.publishCommand(
"Do Nothing Column Command",
function(target, args) --[[ Does nothing ]] end,
cmd.newDefinition()
:setMenuPath( {"Nested","Menu","Command"} )
:addColumnTarget(v, "Value")
))
The command is accessed in the user interface via the overridden path.
Request user-input
Commands may request user-input by specifying arguments as part of the command definition. When a command with arguments is executed, a dialog is presented to the user prompting them to enter the argument values. Several different types of arguments can be specified, and each can optionally define a label (describing the argument) and a default value.
The example below shows a command definition which specifies several arguments:
-- A multi-argument command, defined using the fluent interface
local argsFluent = cmd.newDefinition()
    :setDescription("An example of various command argument types.")  -- Appears at the top of the user input dialog
    :addStringArgument("myString", "My String Default")  -- String argument, with default
    :addIntegerArgument("myInteger", 123456)             -- Integer argument, with default
    :addNumericArgument("myDouble", 123.456)             -- Double argument, with default
    :addBooleanArgument("myBoolean", true)               -- Boolean argument, with default
    :addChoiceArgument("myChoice", {"Red", "Yellow", "Green", "Blue"})  -- Choice argument, 4 options (first is default)
    :addRowTarget(v, '*')                                -- Applies to all rows
Commands may instead be defined in a tabular style. Which style is used is a matter of preference, although the fluent interface provides additional argument validation.
-- The same command definition, in a tabular style
local argsTabular = {
description = "An example of various command argument types.",
args = {
{ type="string", description="myString", default="My String Default" },
{ type="integer", description="myInteger", default=123456 },
{ type="number", description="myDouble", default=123.456 },
{ type="boolean", description="myBoolean", default=true },
{ type="choice", description="myChoice", options={"Red", "Yellow", "Green", "Blue"} }
},
targets = {
{ type="row", view="myView", name="*" }
}
}
When executed, the command will display the following dialog to the user.
If the user executes the command (by clicking OK), the supplied callback function will be called. The command arguments can be accessed via the args parameter, which is an array of argument values.
The order of the values will match the order in which the arguments were added in the command definition. In the example below, when the command is executed we replace the current dataview contents with the returned target and arguments.
-- Shows the arguments of the callback function in the view when executed
assert(gs.publishCommand(
"Show Callback Args",
function(target, args)
v.row = {}
-- Add all target parameters to the view, one row per parameter
for key,value in pairs(target) do
v.row[string.format("target.%s", key)] = { tostring(value), type(value) }
end
-- Add all arguments to the view, one row per argument value
for i,value in ipairs(args) do
v.row[string.format("args[%d]", i)] = { tostring(value), type(value) }
end
assert(v:publish())
end,
argsTabular
))
If this command is executed on the top-right cell "first row.Type", the result is as follows:
Return output from a command
Commands can produce output by returning a string from the callback function. This feature is typically used to provide additional details for a particular row or cell. Output will be displayed to the user that executed the command. If the callback function produces a Lua error, the error message will be both logged to the Netprobe log and returned to the user.
-- A command which outputs a message
-- Since no targets are specified, this command is only accessible from the dataview context menu
assert(gs.publishCommand(
"Output Message",
function(target, args) return "This is a\nmulti\nline\nmessage" end
))
When executed in Active Console, this command displays the following output:
Extend command definitions
Command definitions can be both re-used and extended as required. To re-use a definition, simply pass it to multiple calls of publishCommand (with a different command name each time). The resulting commands will then share the same properties contained in the definition, such as the arguments or description.
Definition objects may also be extended, creating a new updated definition while leaving the original unchanged. This can be used (for example) to define commands that share a common set of arguments or targets, then add additional command-specific arguments / targets.
An example of extending a definition object is shown below. This example shows a base definition baseDef extended twice, once for each command.
-- Base command definition
local baseDef = cmd.newDefinition():addStringArgument("Name")
assert(gs.publishCommand(
"Say Hello",
function(target, args) return "Hello " .. args[1] end,
-- Derived definition, with altered description
baseDef:new():setDescription("Say hello to ...")
))
assert(gs.publishCommand(
"Say Goodbye",
function(target, args) return "Goodbye " .. args[1] end,
-- Derived definition, with altered description
baseDef:new():setDescription("Say goodbye to ...")
))
Subscribe to market data
The following example is a ticker-style application that subscribes to data and prints the results to stdout. It shows how to subscribe to data using the Example adapter, which is a synthetic data adapter. This means the example can be run standalone, without the need for market data connectivity, and can therefore be used in a development environment. Check the inline code comments for additional description.
Full source code available here
.
To use the market data API, the package has to be imported:
-- Import packages
local gs = require "geneos.sampler"
local md = require "geneos.marketdata"
Configure a feed
The feedconfiguration structure contains all the feed connection data and specifies the instruments and fields to be subscribed to. Feed connection data has both generic and feed-specific configuration.
-- Setup feed configuration
local feedConfiguration = {}        -- Feed configuration is a Lua table
feedConfiguration.feed = {          -- Generic feed configuration
    type = "example",
    ["library.filename"] = "flm-feed-example.so",  -- verbose syntax due to '.' character in lib file
    verbose = "false",              -- log verbosity
}
feedConfiguration.example = {       -- Feed-specific configuration
    publishingPeriod = 1000         -- Defines period between published tick updates
}
feedConfiguration.instruments = {   -- Define the set of instruments to subscribe to,
    GOOG = "DATA_SERVICE.GOOG",     -- as a map of names to instrument codes
    IBM = "DATA_SERVICE.IBM"
}
feedConfiguration.fields = {        -- Define the set of fields to subscribe to,
    Trade = "TRADE_PRICE",          -- as a map of names to field codes
    Bid = "BID",
    Ask = "ASK",
    Time = "QUOTIM"
}
It is often useful to populate parts of this structure from sampler configuration parameters. The tutorial gives examples of this, including building up the instruments section by inserting names into a template.
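A minimal sketch of this technique is shown below. The name list is hard-coded here but would typically come from sampler configuration parameters; the "DATA_SERVICE." prefix matches the Example adapter configuration shown earlier.

```lua
-- Sketch: build the 'instruments' table by inserting names into a template
local names = { "GOOG", "IBM", "AAPL" }
local instruments = {}
for _, name in ipairs(names) do
    instruments[name] = "DATA_SERVICE." .. name
end
-- The result can then be assigned to feedConfiguration.instruments
```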
The configuration is applied by creating a feed instance and calling geneos.marketdata.Feed:start() on it:
-- Create and start the feed
local exampleFeed = md.addFeed("Example-Feed", feedConfiguration)  -- Create a local feed using config from above
exampleFeed:start()  -- Start the feed
Debug a feed configuration
It is often useful to test the feed configuration in a minimal sampler that can be run stand-alone, as in this example. This allows the data connection to be tested without trying to debug the display logic at the same time.
When this example is run in standalone mode, it produces the following output:
$ ./netprobe.linux_64 -mdmtest code_examples/feed_subscription.lua
...
-----------------------
Got tick for: GOOG
Trade 55.08
Ask 55.05
Time 55.07
Bid 55.06
-----------------------
Got tick for: IBM
Trade 55.04
Ask 55.01
Time 55.03
Bid 55.02
...
Change subscriptions
It is possible to subscribe to additional instruments and unsubscribe from existing ones without restarting the feed. When adding a new subscription, you can specify a different set of fields as an optional parameter to geneos.marketdata.Feed:subscribe(). The example below uses the existing set of fields, that is, the ones in the feed configuration passed to addFeed().
-- Function to unsubscribe from one instrument and subscribe to another
local function replaceSubscription(removeInst, addInst)
if removeInst then
exampleFeed:unsubscribe(removeInst)
end
if addInst then
exampleFeed:subscribe(addInst, "DATA_SERVICE."..addInst)
end
end
Process tick data
The geneos.marketdata.Feed:getTicks() function is used to collect the ticks received since the last sample. The tutorial explains how to construct a processing loop that allows each tick to be compared with the previous one, including the last tick from the previous sample.
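Such a processing loop can be sketched as follows. Here compareTicks is a hypothetical analysis callback, and the tick list is the linked list returned by getTicks(), modelled as plain Lua tables so the sketch is self-contained:

```lua
-- Sketch: compare each tick with its predecessor, carrying the last tick
-- of one sample over to the next.
local lastTick = nil   -- persists between samples
local function processTicks(firstTick, compareTicks)
    local tick = firstTick
    while tick do
        if lastTick then
            compareTicks(lastTick, tick)  -- e.g. compute price or interval deltas
        end
        lastTick = tick
        tick = tick.next                  -- walk the linked list
    end
end
```

Because lastTick is declared outside the function, the first tick of each sample is compared against the final tick of the previous sample.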
This example simply prints all ticks to stdout via a utility function.
-- Utility function to print tick content
local function printTick(tick)
print("-----------------------")
-- Each tick is a table
print("Got tick for: " .. tick.inst)
print("Trade " .. tick.field.Trade)
-- Subscribed fields appear
print("Ask " .. tick.field.Ask)
-- in a nested table called 'field'
print("Time " .. tick.field.Time)
print("Bid " .. tick.field.Bid)
end
-- Print this sample's ticks for given instrument
local function printTickStream(instName)
local ticks = exampleFeed:getTicks(instName)
while ticks do
printTick(ticks)
ticks = ticks.next
-- ticks are returned as a linked list
end
end
-- doSample() is called periodically by the Netprobe
local count = 0
local otherInst = "IBM"
gs.doSample = function()
printTickStream("GOOG")
printTickStream(otherInst)
count = count + 1
if count == 3 then
replaceSubscription("IBM", "AAPL")
otherInst = "AAPL"
end
end
Customise the tick remarks during matching
By default, during execution of the LCS algorithm, once a match is found the onMatch function is called to process the match.
This function adds certain tick remarks when a match is found. The remark field follows the specific formats documented at geneos.marketdata.Tick.remark.
Tick remarks are generated by concatenating the <MATCH TEXT> with the tick sequence number. This is a relatively expensive operation in Lua, particularly since it is not supported by the just-in-time compilation in LuaJIT; if there are a very large number of ticks and most of them match, it will significantly affect performance.
A user can supply their own onMatch() function to override the built-in one. This allows the user to create their own custom remarks, or to disable the concatenation to improve the performance drastically.
The user must also supply a custom writer function to Context:addWriter() that works with their custom onMatch() function.
(The writer functions returned by the objects created by the newMatchedTicksFile and newTickHistoryFile methods will not work with the onMatch function shown below.)
The following code snippet provides an example of this:
local lat = require "geneos.latency"
...
-- Custom logging function compatible with the custom onMatch() function
local function tickLogger(event, tick)
if (tick and not tick.remark) then
--[[process unmatched tick]]--
end
end
-- Custom onMatch() function
function onMatch(self, latency, nm, bt, ft)
local max = math.max
local min = math.min
-- Document number of matches
self.matches = self.matches + 1
-- Save latency details
self.latSum = self.latSum + latency
self.latSumSq = self.latSumSq + (latency * latency)
self.latMin = self.latMin and min(self.latMin, latency) or latency
self.latMax = self.latMax and max(self.latMax, latency) or latency
-- Change remark format to remove text and just leave sequence numbers/latency
if (bt.seq) then
ft.remark = bt.seq
local matchedBy = ft.seq
if not bt.remark or bt.remark < matchedBy then bt.remark = matchedBy end
else
ft.remark = latency
bt.remark = latency
end
end
--- Create new context
local ctx = lat.newContext{ ... }
...
-- Add tickLogger() as a writer function
ctx:addWriter(tickLogger)
...
--- Register the base feed
ctx:setBase(...)
--- Register the alt feed
ctx:addFeed(...)
-- Since this is the first alt feed, change the onMatch() function of feeds[1]
ctx.feeds[1].metrics.onMatch = onMatch
Run the latency algorithm
The geneos.latency module manages two or more feed configurations, acting as a wrapper over the market data module. It calculates the relative latency between a baseline feed and each of the other feeds. To use the module, you create a latency context, provide it with two or more feed configurations, and use its methods to obtain the resulting latency metrics.
The geneos.latency.files companion module provides support for logging the ticks that are passed into the algorithm and the matches between ticks that provide the basis for the metrics.
The following code example demonstrates how to use both of these modules. It uses the RFA adapter to connect to two different RFA sessions, subscribing to the same instruments on each. On each sample, the algorithm is run and summary results are displayed.
Full source code is available here
.
The first step is to import the required geneos packages:
-- Import packages
local gs = require "geneos.sampler"
local lt = require "geneos.latency"
local lf = require "geneos.latency.files"
Customise the latency algorithm
The built-in latency matching algorithm can be adjusted to suit particular requirements. This guide explains some of the options available and how to enable them in Lua code. It assumes a working latency matching configuration; see Running the latency algorithm for details on how to achieve this.
The latency algorithm uses tick matching functions to match ticks from the base feed against each comparison feed. That is, if there is a baseline feed, "Base" and two comparison feeds, "Feed_A" and "Feed_B", it will use one function to match ticks from "Base" against "Feed_A" and another to match "Base" against "Feed_B".
By default, these functions compare each field as a number, with a tolerance of 0.000001, stopping when they find a field that does not match. They also apply a time threshold: if the difference between the arrival time of two ticks exceeds 2 seconds, the ticks are considered not to match. Each function only compares fields that are configured for both feeds; that is, if a field is configured for the "Base" feed and for "Feed_A", but not for "Feed_B", the function that matches "Base" against "Feed_A" considers it, but the function that matches "Base" against "Feed_B" ignores it.
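The default per-field comparison can be illustrated in plain Lua as below. This is a sketch only: the real matching functions live inside the geneos.latency module, and additionally apply the 2-second arrival time threshold described above.

```lua
-- Illustrative sketch of the default field comparison (not the library code)
local TOLERANCE = 0.000001
local function fieldsMatch(a, b)
    local na, nb = tonumber(a), tonumber(b)
    if na and nb then
        return math.abs(na - nb) <= TOLERANCE  -- numeric, within tolerance
    end
    return a == b                              -- otherwise require exact equality
end
```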
To customise the matching functions, you can override the default values for the field tolerances and time thresholds when you create the latency context. Alternatively, you can explicitly specify a comparison function each time you call addFeed. If you use an explicit comparison function, you can use getTickComparator to create it for you, or you can code a function from scratch to implement the geneos.latency.LcsComparator() specification.
Feed configuration
The latency module uses the same feedconfiguration structure as the market data module.
Because the same sets of instrument and field names will be used for each feed, it is useful to factor these out into separate tables that can be used by the feed configurations. The Emulating the FLM plug-in using MDM example illustrates this in more detail.
-- Define fields and instruments
local fields = { Trade="TRDPRC_1", Change="NETCHNG_1", Bid="BID", BidSize="BIDSIZE", Ask="ASK", AskSize="ASKSIZE" }
local insts = { Bund="IDN_SELECTFEED.Bund", DAX="IDN_SELECTFEED.DAX", Eurostoxx="IDN_SELECTFEED.Eurostoxx" }
-- Using an RFA base data source
local rfaBase = {
feed = { type = "rfa", ["library.filename"] = "flm-feed-rfa.so"},
rfa = { configFile = "RFA.cfg", session = "OMMFeedBase", connectionType = "OMM" },
instruments = insts,
fields = fields
}
-- Using an RFA alt data source
local rfaAlt =
{
feed = { type = "rfa", ["library.filename"] = "flm-feed-rfa.so"},
rfa = { configFile = "RFA.cfg", session = "OMMFeedAlt", connectionType = "OMM" },
instruments = insts,
fields = fields
}
To run this code you should modify the feed, instrument and field settings to match your data source. See Feed Adapters for the applicable settings for your feed type.
Set up the latency context
To use the latency module, instead of calling geneos.marketdata.addFeed() for each feed configuration, you create a latency context and add feeds to it. This involves calling Context:setBase() exactly once, for the baseline feed, and calling Context:addFeed() for each feed that is to be compared to the baseline. At this stage, it is possible to customise the way ticks are matched and the statistics that are calculated as matches are detected. This is covered in the Customising the latency algorithm example.
-- Create and configure the latency context
local ctx = lt.newContext()          -- Create the context object
    :setBase("Base-Eurex", rfaBase)  -- Register the base feed
    :addFeed("Alt-Eurex", rfaAlt)    -- Add an alternate feed
For this example, we have also configured a tick history logger and a matched tick logger. The tick history logger logs details of every tick in all the subscriptions in a fairly verbose format. It is a useful diagnostic tool, but a production sampler should use a configuration parameter to determine whether it should be enabled. Alternatively, the log files can be managed by specifying file rollover settings, as described here. The matched tick logger writes one record to a CSV file for each baseline tick that is matched by one or more of the other feeds.
After registering the tick loggers, calling Context:start() starts the subscriptions.
-- Create tick loggers and start the context
local historyLogger = lf.newTickHistoryFile(gs.name..".%y%m%d.ticks")
local matchedTickLogger = lf.newMatchedTicksFile(ctx:getConfiguredNames(), gs.name..".%y%m%d.csv")
ctx:addWriter(historyLogger:getWriter())      -- Register the history logger's tick writer method
   :addWriter(matchedTickLogger:getWriter())  -- and the same for the matched tick logger
   :start()                                   -- Start the subscriptions
Display the data
The sampler's doSample() function must call the Context:sample() method to collect the ticks from each subscription and run the latency algorithm. Context:getMetrics() is used to obtain metrics which can be formatted for display. This example constructs just one dataview and uses only feed-level summary metrics; the Emulating the FLM plug-in using MDM example uses metrics at the instrument level and tick data values to provide additional dataviews.
-- Create a view to display the data
local view = gs.createView("LATENCY", {"feed", "status", "numTicks", "numMatches", "minLatency", "maxLatency" })
view.headline.baselineFeed = "Base-Eurex"
view:publish()
-- Publish a stats view, similar to FLM Latency view
local function formatMillisec(value)
-- Format nil value as blank string
return value and string.format("%0.1f", value*1000) or ""
end
local count = 0
gs.doSample = function()
count = count + 1
print("[latency ] sample " .. count)
ctx:sample()
-- Process the sample
local mBase = ctx:getMetrics("Base-Eurex")
-- Grab metrics from the base feed
local mAlt = ctx:getMetrics("Alt-Eurex")
-- Grab metrics from the alternate feed
view.row = {}
view.row["Base-Eurex"] = {
ctx:getFeedStatus("Base-Eurex"),
-- Status of the base feed
mBase.numTicks,
-- Add remaining stats to row
mBase.matches,
"",
-- The base feed has no latency stats
""
}
view.row["Alt-Eurex"] = {
ctx:getFeedStatus("Alt-Eurex"),
mAlt.numTicks,
-- Add remaining stats to row
mAlt.matches,
formatMillisec(mAlt.latMin),
formatMillisec(mAlt.latMax),
}
view:publish()
-- publish the updated view to Geneos
end
Manage log files
Log files, and especially tick history files, can grow large rather quickly. To manage these files, it is useful to have the history split over a number of files, with a new file being created either after an interval of time or when the current file reaches a certain size. Older files should be deleted or archived once a certain number of files have been created.
The TickLogger base class allows a rollover interval to be set for a logger. When this interval expires (at midnight, by default) the log file is closed and reopened. If you specify date and time formatting codes in the filename of the logger, the new log file will be given a new name. Additionally, you can specify a callback function when you set the roll interval and this function can take actions such as archiving older files.
The following function packages the creation of a tick history logger with the definition of a callback function which limits the number of log files kept:
local function getDailyRollingTickLogger(prefix, keep)
local dirCmd = string.format("ls -1t %s*", prefix)
local function callback()
local cmdOutput = assert(io.popen(dirCmd))
local kept = 0
local prefixLen = prefix:len()
for name in cmdOutput:lines() do
if kept < keep then
kept = kept + 1
gs.logMessage("INFO", "Keeping ", name)
else
assert(os.remove(name))
gs.logMessage("INFO", "Deleting ", name)
end
end
cmdOutput:close()
end
return lf.newTickHistoryFile(prefix.."%Y%m%d"):setFileRoll(nil, nil, callback)
end
The default parameters for roll interval and first roll time are used and the file name encodes the date, so a new file will be created each night, at the start of the first sample after midnight. Just before the new file is created, the callback function is run. It uses the Unix ls command to obtain a list of existing log files and deletes all but the newest two of these.
The function shown above would be used like this:
-- latency context created previously (but not yet started)
ctx:addWriter(getDailyRollingTickLogger("myLogFile.", 2):getWriter())
ctx:start()
The next example is a little more complicated; it manages files based on size and renames saved files using a rolling numeric suffix. The file name does not include date or time formatting codes, so, unless the callback intervenes, the same file is closed and reopened each time the roll interval expires.
The callback checks the file size and, if it exceeds a threshold, does a rolling rename of the saved files. Note that the operating system command used to check the file size changes depending on the host operating system: if this is not Windows, it is assumed to be Unix-like. Also note that the code does not explicitly check for the existence of saved files: it simply checks that, if the remove or rename operation fails, it was because the file did not exist.
local function assertOkOrNotFound(ok, err)
    if not ok and not err:find("No such file") then
        error(err, 2)
    end
end

local function getSizeCappedLogger(sizeLimit, keepFiles, interval)
    -- Host is Windows when the path separator is a backslash
    local runningOnWindows = package.config:sub(1,1) == "\\"
    local sizeCmdTemplate = runningOnWindows and "for %%I in (%s) do @echo %%~zI"
                            or "du -b %s | cut -f1"
    local function callback(logger)
        local actualName = logger:getActualFilename()
        if not actualName then return end
        local sizeCmd = string.format(sizeCmdTemplate, actualName)
        local pipe = assert(io.popen(sizeCmd))
        local result = pipe:read('*n')
        pipe:close()
        gs.logMessage("INFO", "Size of ", actualName, " is now ", result)
        if result > sizeLimit then
            local rollTemplate = actualName .. ".%d"
            local rollTo = rollTemplate:format(keepFiles)
            assertOkOrNotFound(os.remove(rollTo))
            for n = keepFiles, 1, -1 do
                local rollFrom = n > 1 and rollTemplate:format(n-1) or actualName
                assertOkOrNotFound(os.rename(rollFrom, rollTo))
                rollTo = rollFrom
            end
        end
    end
    local logger = lf.newTickHistoryFile(gs.name .. ".tick.history")
    return logger:setFileRoll(interval, h.gettimeofday() + interval, callback)
end
This is how the function above would be used to limit file size to 20MB with three saved files, checking file size every ten minutes (600 seconds):
-- latency context created previously (but not yet started)
ctx:addWriter(getSizeCappedLogger(20 * 1024 * 1024, 3, 600):getWriter())
ctx:start()
Specify tolerances for fields
Two fields will be treated as a match if the difference between them is less than or equal to the tolerance value. The default tolerance for a numeric field match is 0.000001. The tolerance can be configured on a field-by-field basis to allow more or less variation in matching fields.
Create a map of field names to new tolerance values for fields in your feed.
local tolerances = { Bid = 0.001, Ask = 0.001 }
The simplest way to override the default tolerances is to do this for all feeds, when constructing the latency context. In this case, any fields that are not included in the map will have the default tolerance applied.
local lat = require "geneos.latency"
...
local ctx = lat.newContext{ tolerances = { Bid = 0.001, Ask = 0.001 } }
    :setBase("BaseFeed", baseConfig)
    :addFeed("TolerantFeed", feedConfig)  -- using tolerances customised above
    :addFeed("OtherFeed", feedConfig)     -- also uses custom tolerances
    :start()
If you want to override the tolerances for only one feed, create a customised matching function by calling getTickComparator. In this case, any fields not included in the map will be skipped when comparing ticks; if you want to use the default tolerance, you must specify it. In the same way, the default time threshold of two seconds must be specified if it is to be applied. Finally, pass in the new matching function when calling addFeed on the context.
local lat = require "geneos.latency"
...
local tolerances = { Bid = 0.001, Ask = 0.001, Trade = 0.000001 }
local timeDiffMax = 2.0
local tolerant_match_function = lat.getTickComparator(tolerances, timeDiffMax)
local ctx = lat.newContext()
    :setBase("BaseFeed", baseConfig)
    :addFeed("TolerantFeed", feedConfig, nil, tolerant_match_function)
    :addFeed("OtherFeed", feedConfig)  -- uses default tolerances
    :start()
Enable string matching for a field
By default, string fields (other than empty strings) will not match. This behaviour can be changed by specifying a tolerance of 0 for the field. A tolerance of 0 makes the matching function compare with the '==' operator instead of '<=', which correctly matches identical strings.
Once again, you can pass a tolerance map when constructing the latency context or create a customised matching function. The following code snippet provides an example of the first method:
local lat = require "geneos.latency"
...
local ctx = lat.newContext{ tolerances = { StringField1 = 0, StringField2 = 0 } }
    :setBase("BaseFeed", baseConfig)
    :addFeed("TolerantFeed", feedConfig)
    :start()
Change time thresholds
Time thresholds set the minimum and maximum difference, considered as a signed number, between tick arrival times for them to be considered a match. A positive difference implies that the base feed tick arrives before the comparison feed tick; a negative difference implies the opposite. By default, the minimum time threshold is set to -2.0 seconds and the maximum to +2.0 seconds.
To override this, the minimum and maximum can be supplied either when constructing the latency context or when creating a customised matching function. If you want to ignore timestamps altogether when matching ticks, you must create a customised matching function and omit both time parameters.
Parameters passed to newContext apply to all comparison feeds:
local lat = require "geneos.latency"
...
local ctx = lat.newContext{ timeDiffMin = -0.2, timeDiffMax = 0.5 }
    :setBase("BaseFeed", baseConfig)
    :addFeed("TimeSensitiveFeed", feedConfig)  -- using thresholds customised above
    :addFeed("OtherFeed", otherConfig)         -- also uses custom time thresholds
    :start()
If a customised matching function is used, field tolerances must also be specified, even if the default tolerances are required.
local lat = require "geneos.latency"
...
local maxTime = 0.5
local minTime = -0.2
local tolerances = { Bid = 0.000001, Ask = 0.000001, Trade = 0.000001 }
local match_function = lat.getTickComparator(tolerances, maxTime, minTime)
local ctx = lat.newContext()
    :setBase("BaseFeed", baseConfig)
    :addFeed("TimeSensitiveFeed", feedConfig, nil, match_function)
    :addFeed("OtherFeed", otherConfig)  -- uses default time thresholds
    :start()
If one of the time threshold parameters is omitted when creating a customised matching function, it will default to the negated value of the other. If the value specified for the minimum is greater than the maximum, the values are swapped. If both parameters are omitted, no time threshold will be applied.
Disable sorting for the history writer
By default, the results written by the history writer are sorted chronologically. This sorting operation takes time proportional to the square of the number of tick streams. If there are a large number of feeds, or a large number of instruments, this will significantly affect performance.
Below is a diagram that outlines what happens during tick collection when the history writer is enabled:
The comesBefore() function checks whether one tick's timestamp is later than another's before inserting the tick into the merged list. If this check is disabled, ticks are appended without comparison, making creation of the merged list faster.
The following code snippet provides an example of this:
local lat = require "geneos.latency"
...
-- Create a new context
local ctx = lat.newContext{ ... }
...
-- Add a history writer
ctx:addWriter( ... )
...
-- Disable sorting: the override always reports "not before"
ctx.histMgr.comesBefore = function(self) return false end
Write a feed adapter
This documentation describes the Feed Adapter (data input) API that a shared-object must expose to operate with MDM plug-ins. Using this API, clients can create their own adapter libraries to interface with proprietary systems. The document also explains the packaged C++ wrapper API for C++ development, and the Lua Feed API for developing feeds via Lua scripting.
The Geneos Market Data Monitor (MDM) plug-in provides a programmable environment allowing users to create customised market data monitoring solutions.
Input data to MDM plug-in instances is provided by feed adapter library shared-objects (.so files) on UNIX platforms, or dynamic-link libraries (.dll files) on Windows platforms. These software modules translate market data obtained via a direct market data API into the normalised format expected by the plug-in.
Audience
This document is intended for users who wish to write a custom feed adapter shared-object (or DLL) to provide an additional data input for the Market Data Monitor plug-in.
Users developing against the C API are expected to understand the C programming language.
Users developing against the C++ API are expected to understand the C++ programming language. They will also need an understanding of the C API that the C++ API is built upon. The packaged C++ example feed code requires the GCC C++ compiler 3.4.6 (or better) and GNU Make 3.80 (or better) to compile. A Visual Studio 2010 solution is also provided for compilation on Windows platforms.
Users developing using the Lua Feed API are expected to understand the Lua programming language.
Package Contents
The API package file contains a single folder, mdmFeedAdapterApiKit. The contents of this folder are as follows:
File | Description |
---|---|
/docs | This documentation. |
/docs/index.html | The contents page of the documentation. |
/Makefile | Makefile for Linux builds of the C++ API library and example feed. |
/apikit.sln | Microsoft Visual Studio 2010 solution file for Windows builds of the C++ API library and example feed. |
/api/FeedAdapterCApi.h | The entire C API as a single header file. |
/api/*.h, /api/*.cpp | The C++ API class library headers and implementation. |
/example | Example feed adapter implementation using the C++ API. |
/scripts/mdmTest_example.so.lua | MDM sampler script for testing the C++ example feed using the Netprobe stand-alone command-line test mode. |
/scripts/mdmTest_geneos-feed-lua.so.lua | MDM sampler script for testing a Lua feed (input) script, executed using the ITRS Lua feed adapter "geneos-feed-lua.so". |
/scripts/inputFeed.lua | Example Lua feed script, used by the above MDM sampler script. |
API Interface
Connectivity to each type of market data system is provided by a separate feed adapter library (.so or .dll file). Communication with each library (including loading the library) is managed by an internal component of the Netprobe called the Feed Controller. This component also handles combining requests and distributing ticks for multiple plug-in instances.
Each feed adapter library must export the functions defined by the Feed Adapter C API. These functions must be present for the Feed Controller to load the library, and are used to control and communicate with the adapter library. The functions fall into 5 main categories:
- Library management - initialisation and un-initialisation.
- Feed management - creation and destruction.
- Instrument management - subscription and un-subscription.
- Reporting - errors and status updates.
- Publishing - tick data.
Architecture
The diagram above displays the flow of tick data through the system, provided via 3 different feed implementations. The blue boxes (with white text) indicate modules to be written by the developer. These modules will communicate with a market data system using the vendor-provided API, and transform the data as required to a format suitable for input to Netprobe.
C Feed Adapter
A C feed adapter is a shared-object written in C that exports the functions defined by the Feed Adapter C API.
C++ Feed Adapter
A C++ feed adapter is a shared-object that still exposes the functions from the Feed Adapter C API, but whose internal code is written in C++.
The API Kit provides source code for an optional C++ "api" library, which wraps the C API in an object-oriented interface and provides some common utility methods useful for feed adapter development. The package also provides source code for an example C++ feed implementation, which is discussed later in this document.
Lua Feed Adapter
The Lua feed adapter library geneos-feed-lua.so is a standard library provided with the Netprobe package. This adapter provides a Lua version of the Feed Adapter C API, allowing execution of a Lua script to provide tick data. Lua scripting allows rapid development of new feed adapters without the need to compile code.
In the diagram, the script file inputFeed.lua is being executed to provide feed data. This example script is provided as part of the API Kit package and is discussed later in this document.
Feed Adapter interaction with Netprobe
Each feed adapter library is loaded once only, and used to host all feeds configured for that adapter type. For example, multiple feeds connecting to different TREP components may be configured, but the flm-feed-rfa.so adapter library will only be loaded by the Netprobe once. If the configuration is changed to remove all feeds for a given adapter type, the feed adapter library will be unloaded from the Netprobe process.
Changes in feed configuration from plug-in instances are processed by the Feed Controller and presented to each affected feed adapter library as a series of API calls. The exact calls will vary according to the setup changes made, but typically consist of instruments being added to (or removed from) a feed instance.
Feed adapters should push tick data to the Netprobe on arrival, via the provided function calls. In particular, feed adapters should not wait until an API call is made from the Netprobe to provide data. As such, it is likely that the feed adapter library code (or the direct market data API) will need to execute in a separate thread. Input data will be buffered by the Netprobe (up to a limit) until the plug-in performs a sample, at which point it is processed and consumed.
If multiple MDM plug-ins are configured, plug-in instances will share the loaded feed adapter libraries; each library will still be loaded once only. Moreover if feed definitions in different plug-in instances are identical, then the MDM plug-ins will share the feed between them. This behaviour ensures that the Netprobe presents the minimum number of connections to each market data system.
By implementing the API interface, your feed adapter library will support these behaviours by default. No special handling for these conditions is required.
Warning:
The feed adapter library is a shared-library, meaning it shares the Netprobe's memory address-space. As such, system calls such as exit, abort, or terminate will end the Netprobe process and should be avoided. Additionally, coding errors such as dereferencing invalid pointers, memory leaks, or file handle leaks can all negatively affect the Netprobe.
It is recommended that the custom library be thoroughly tested before production usage.
Develop a Feed Adapter in C
Compiling the C template file
As discussed in the Concepts section, a feed adapter library must export the methods defined by the Feed Adapter C API.
A template C source file containing a stub feed adapter library implementation can be found in the C template section. This template can be used as a starting point when developing your own feed adapter library.
To compile on Linux, copy or save the file to your machine and run one of the commands shown below. The APIKIT_DIR variable should refer to the directory that the API Kit was extracted to (this directory should contain an "api" folder). You can either set APIKIT_DIR as an environment variable prior to running the command, or replace the variable reference with the path directly.
$ gcc -Wall -fPIC -shared -I${APIKIT_DIR} template.c -o template.so
or
$ gcc -Wall -fPIC -shared -I/path/to/kit template.c -o template.so
Reporting errors
Feed Adapter C API calls into the feed adapter library are all made by the main Netprobe thread. If any function returns a failure result (i.e. the value 0), the Netprobe will attempt to obtain a description of the error via the get_error_text() function. This error text will then be output in the Netprobe log.
The get_error_text() function is expected to behave in a similar manner to the C library function snprintf. It should print a NUL-terminated string to the input buffer describing the latest error resulting from an API call, truncating the message as necessary. If the input buffer is not large enough to contain the entire message then the length of the required buffer should be returned; otherwise 0 should be returned.
The Netprobe will attempt to obtain the error message with an initial buffer size of 2048 (2kB). If this is not sufficient, the call will be repeated with a buffer increased to the size returned by the function.
A reference implementation of this function is provided as part of the C template example code.
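To make the snprintf-like contract concrete, here is a minimal sketch of one possible shape for this function. The exact signature is declared in api/FeedAdapterCApi.h; the signature, static buffer, and set_error() helper used here are illustrative assumptions, not the kit's reference implementation.

```c
#include <stdio.h>
#include <string.h>

/* Illustrative only: the buffer size mirrors the Netprobe's initial
 * 2kB request; set_error() is a hypothetical helper, not part of the
 * kit. */
static char last_error[2048];

/* Adapter code calls this to record the most recent error. */
static void set_error(const char* msg)
{
    snprintf(last_error, sizeof(last_error), "%s", msg);
}

/* Prints the latest error into 'buffer', truncating as necessary.
 * Returns 0 if the whole message fit, otherwise the buffer size
 * required to hold it (including the terminating NUL). */
size_t get_error_text(char* buffer, size_t buffer_size)
{
    size_t needed = strlen(last_error) + 1;
    snprintf(buffer, buffer_size, "%s", last_error);
    return (needed <= buffer_size) ? 0 : needed;
}
```

With this shape, the Netprobe's retry behaviour works naturally: the first call with a 2048-byte buffer either succeeds (returns 0) or reports the size needed for the repeated call.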
Library management
The initialise_library() function is called immediately after the feed adapter library is loaded by the Netprobe. This call provides the feed adapter library with a set of function pointers used for publishing data to the Netprobe. These pointers will be valid for the lifetime of the feed adapter library.
The initialise function can also be used to initialise any dependent libraries or global state required by all feeds, perhaps as an alternative to platform-specific shared-object initialisation (e.g. the GCC __attribute__((constructor)) syntax on Unix, or the DllMain function on Windows).
The finalise_library() function is called immediately before the feed adapter library is unloaded by the Netprobe. This function may be used to perform a tidy shutdown where necessary.
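As a sketch of this lifecycle, the fragment below stores the publishing callbacks in static storage for later use. The real callback-table type and the exact initialise_library() signature are defined in api/FeedAdapterCApi.h; the publish_callbacks_t struct and parameter shown here are stand-in assumptions.

```c
#include <stddef.h>

/* Stand-in for the kit's callback table; the real type and members
 * come from api/FeedAdapterCApi.h. */
typedef struct {
    void (*publish_tick)(void);   /* placeholder member */
} publish_callbacks_t;

/* The function pointers remain valid for the library's lifetime, so a
 * copy can simply be kept in static storage for all feeds to use. */
static publish_callbacks_t g_callbacks;
static int g_initialised = 0;

int initialise_library(const publish_callbacks_t* callbacks)
{
    g_callbacks = *callbacks;   /* retain for later publishing         */
    g_initialised = 1;          /* plus any global setup for all feeds */
    return 1;                   /* non-zero indicates success          */
}

void finalise_library(void)
{
    g_initialised = 0;          /* tidy shutdown of global state       */
}
```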
Feed management
The create_feed() function is called to create a new feed instance in the feed adapter library. This function will be called once per unique feed configuration present on the Netprobe. Multiple plug-in instances with the same feed configuration will result in only a single call to create_feed(), meaning access to the feed will be shared between plug-in instances.
A feed identifier is generated by the Netprobe and passed to the feed adapter library. This id will be used to reference the new feed for the remainder of its lifetime, and must be provided when publishing data. The feed is also passed configuration information, as a list of key-value pairs. The num_params argument gives the number of entries present in the keys and values arrays. The code example below shows how to print all feed parameters when the feed is created:
int create_feed(feed_id_t id, size_t num_params, const char** keys, const char** values)
{
    for (size_t i = 0; i < num_params; ++i)
        printf("key '%s' has value '%s'\n", keys[i], values[i]);
    return 1;
}
The destroy_feed() function is called to destroy a previously created feed when it is no longer required. Upon completion of the function call, no further data should be published for the feed. If the feed is running in a thread, it may be necessary to block in the destroy_feed() call until the thread has been terminated.
Both functions should return non-zero on success. On failure, return zero and ensure the next call to get_error_text() returns a populated error message.
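A minimal sketch of destroy_feed() for a threaded feed is shown below. The feed_id_t stand-in and the feed_running flag are illustrative; a real adapter would signal its worker thread and join it before returning, so that no data is published afterwards.

```c
#include <stdio.h>

/* Stand-in for the identifier type declared in api/FeedAdapterCApi.h. */
typedef struct { int id; } feed_id_t;

/* Hypothetical flag polled by the feed's publishing thread. */
static int feed_running = 1;

int destroy_feed(feed_id_t id)
{
    printf("destroying feed %d\n", id.id);
    feed_running = 0;  /* ask the worker to stop publishing */
    /* A threaded feed would block here, e.g. pthread_join(worker, NULL),
     * until the thread has terminated. */
    return 1;          /* non-zero indicates success */
}
```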
During sampling, the MDM plug-in will attempt to obtain the latest feed status by calling the get_feed_status() function. This function should return a short summary of the feed status that is suitable for display in a dataview. The full error text for any problems with a feed should be reported via a log message, or via the get_error_text() function for API call failures.
The get_feed_status() implementation should behave as get_error_text() does, copying the feed status into the buffer provided. One added complication with this function is that the feed status will be requested by the main Netprobe thread, but (typically) populated by a feed thread. Therefore you must ensure adequate locking (e.g. by using a mutex) around access to this data. Alternatively, for feeds with a limited number of states, it may be possible to use a C sig_atomic_t type (or the C++11 atomic types) to indicate the current feed state without locking.
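For the lock-free variant mentioned above, a feed with a small fixed set of states can publish its state through a single sig_atomic_t variable. The state names and the get_feed_status() signature below are illustrative assumptions; the real signature is declared in api/FeedAdapterCApi.h.

```c
#include <signal.h>
#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Illustrative feed states; a single atomic store/load needs no mutex. */
enum { FEED_DOWN, FEED_CONNECTING, FEED_UP };
static volatile sig_atomic_t feed_state = FEED_DOWN;

/* Feed thread: record a state change with one atomic store. */
void set_feed_state(int state) { feed_state = state; }

/* Main Netprobe thread: render the current state as a short status
 * string, using the same truncation contract as get_error_text(). */
size_t get_feed_status(char* buffer, size_t buffer_size)
{
    static const char* text[] = { "DOWN", "CONNECTING", "UP" };
    size_t needed = strlen(text[feed_state]) + 1;
    snprintf(buffer, buffer_size, "%s", text[feed_state]);
    return (needed <= buffer_size) ? 0 : needed;
}
```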
Instrument management
Instrument subscriptions for a given feed are requested via the subscribe_instrument() function. The function is passed a Netprobe-generated identifier for the subscription (the inst_sub_t type) along with the instrument stock code (or other string-based identifier). The subscription identifier is an opaque pointer to a subscription record maintained by the Netprobe. This identifier (along with the feed id) is required information when publishing data.
The instrument fields that the MDM plug-in is configured with are passed as an array of string values. The order of field names in this array is important, because when a tick is published back to the Netprobe the field values are expected to match this same order. Example C code for printing a subscription request is shown below:
int subscribe_instrument(feed_id_t feed, inst_sub_t sub, const char* instrument,
                         size_t num_fields, const char** field_names)
{
    printf("new subscription request for feed(%d), subscription(%p)\n",
           feed.id, sub.data);
    for (size_t i = 0; i < num_fields; ++i)
        printf("  index %02zu, field name '%s'\n", i, field_names[i]);
    return 1;
}
An unsubscribe_instrument() API call will be made by the Netprobe when an instrument is removed from the configuration (or, in some cases, when another plug-in instance is added that uses the same feed and instrument). Once the function call has completed, the subscription id (inst_sub_t) becomes invalid and must not be used to publish any further data. As for the destroy_feed() function, for threaded feeds it may be necessary to block in the unsubscribe_instrument() call until the feed confirms that the instrument is no longer in use.
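Mirroring the subscribe example earlier, a minimal unsubscribe_instrument() might look like the sketch below. The identifier typedefs are stand-ins for those declared in api/FeedAdapterCApi.h, and the exact signature is an assumption.

```c
#include <stdio.h>

/* Stand-ins for the identifier types from api/FeedAdapterCApi.h. */
typedef struct { int id; }     feed_id_t;
typedef struct { void* data; } inst_sub_t;

/* After this call returns, 'sub' is invalid and must not be used to
 * publish further data. A threaded feed would block here until its
 * worker confirms the instrument has been released. */
int unsubscribe_instrument(feed_id_t feed, inst_sub_t sub)
{
    printf("unsubscribe request for feed(%d), subscription(%p)\n",
           feed.id, sub.data);
    return 1;  /* non-zero indicates success */
}
```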
Develop a Feed Adapter in C++
Quick Start
The following quick start instructions are for the Linux platform. They should also be applicable to other UNIX-based systems, but may require alterations to the Makefile to select the compiler and compiler options for the target platform.
- Download the MDM Feed Adapter API Kit package, and a suitable Netprobe for your operating system type (i.e. 32-bit or 64-bit).
- Unpack the contents of both packages into the same directory.
- Obtain an MDM application key from your ITRS support contact (showing that you have accepted the Geneos API agreement) and place it in your Netprobe directory.
- Change to the kit directory and run make to compile the C++ example feed adapter library.
- From the same directory, run make test to test the library by executing it with the Netprobe.
- Alter the Makefile, example code, and test script as required to implement your own feed adapter with a feed vendor's direct market data API.
An example terminal session showing these steps is shown below:
$ ls
mdm_appkey.txt mdmFeedAdapterApiKit.tar.gz netprobe.linux_64.tar.gz
$ mkdir unpack
$ cd unpack
$ tar xf ../netprobe.linux_64.tar.gz
$ tar xf ../mdmFeedAdapterApiKit.tar.gz
$ cp ../mdm_appkey.txt .
$ cd mdmFeedAdapterApiKit
$ make
make -C api
... <compiler output>
Done mdm-feed-api.a
make -C example
... <compiler output>
Done example.so
$ make test
... <Netprobe test output>
<Fri Nov 15 16:54:30> INFO: mdmTest Reading key file 'mdm_appkey.txt', contents listed below.
<Fri Nov 15 16:54:30> INFO: mdmTest Client : ITRS Group
<Fri Nov 15 16:54:30> INFO: mdmTest Contact : Doc-Generator
<Fri Nov 15 16:54:30> INFO: mdmTest Test mode : Enabled
<Fri Nov 15 16:54:30> INFO: mdmTest Gateway mode: Enabled
<Fri Nov 15 16:54:30> INFO: FeedController Create feed request for 'ExampleFeed' from plugin 'mdmTest'.
<Fri Nov 15 16:54:30> INFO: FeedController First-time load of shared-library example/example.so
<Fri Nov 15 16:54:30> INFO: FeedController:AdapterLibrary Successfully loaded feed library 'example/example.so'. Reported version: 'ExampleFeed 1.0.0'.
<Fri Nov 15 16:54:30> INFO: FeedController:AdapterFeed Calling create_feed() with parameters:
example.setting1 => v1
example.setting2 => v2
feed.library.filename => example/example.so
feed.library.skipVersionCheck => true
feed.name => ExampleFeed
feed.type => example
feed.verbose => true
<Fri Nov 15 16:54:30> INFO: ExampleFeed doValidateConfig called
<Fri Nov 15 16:54:30> INFO: ExampleFeed Parameter 'example.setting1' has value 'v1'.
<Fri Nov 15 16:54:30> INFO: ExampleFeed Parameter 'example.setting2' has value 'v2'.
<Fri Nov 15 16:54:30> INFO: ExampleFeed Parameter 'feed.library.filename' has value 'example/example.so'.
<Fri Nov 15 16:54:30> INFO: ExampleFeed Parameter 'feed.library.skipVersionCheck' has value 'true'.
<Fri Nov 15 16:54:30> INFO: ExampleFeed Parameter 'feed.name' has value 'ExampleFeed'.
<Fri Nov 15 16:54:30> INFO: ExampleFeed Parameter 'feed.type' has value 'example'.
<Fri Nov 15 16:54:30> INFO: ExampleFeed Parameter 'feed.verbose' has value 'true'.
<Fri Nov 15 16:54:30> INFO: ExampleFeed doStart called
<Fri Nov 15 16:54:30> INFO: FeedController:AdapterFeed Feed creation successful, feed_id=0.
<Fri Nov 15 16:54:30> INFO: Computing subscription differences for feed ExampleFeed (mdmTest)
<Fri Nov 15 16:54:30> INFO: Existing subscription data: (none)
<Fri Nov 15 16:54:30> INFO: Unaltered subscriptions: (none)
<Fri Nov 15 16:54:30> INFO: New subscriptions:
ExampleFeed: InstCode1 (Inst1) [Ask, Bid, Trade Price]
<Fri Nov 15 16:54:30> INFO: FeedController:AdapterFeed Subscribing to instrument 'InstCode1' for feed 'ExampleFeed' (sub=0xca2acd8). 3 fields; [Ask, Bid, Trade Price]
<Fri Nov 15 16:54:30> INFO: ExampleFeed doSubscribe called
<Fri Nov 15 16:54:30> INFO: ExampleFeed instrument: InstCode1
<Fri Nov 15 16:54:30> INFO: ExampleFeed field 0: Ask
<Fri Nov 15 16:54:30> INFO: ExampleFeed field 1: Bid
<Fri Nov 15 16:54:30> INFO: ExampleFeed field 2: Trade Price
<Fri Nov 15 16:54:30> INFO: ExampleFeed Publishing thread starting
Sample 1 Status OK
ExampleFeed Inst1 Ask=55.01, TradePrice=55.03, Bid=55.02
Sample 2 Status OK
ExampleFeed Inst1 Ask=55.04, TradePrice=55.06, Bid=55.05
Sample 3 Status OK
ExampleFeed Inst1 Ask=55.07, TradePrice=55.09, Bid=55.08
Sample 4 Status OK
ExampleFeed Inst1 Ask=55.1, TradePrice=55.12, Bid=55.11
Sample 5 Status OK
ExampleFeed Inst1 Ask=55.13, TradePrice=55.15, Bid=55.14
<Fri Nov 15 16:54:35> INFO: Computing subscription differences for feed ExampleFeed (mdmTest)
<Fri Nov 15 16:54:35> INFO: Existing subscription data:
InstCode1 [Ask, Bid, Trade Price]
<Fri Nov 15 16:54:35> INFO: Unaltered subscriptions:
ExampleFeed: InstCode1 (Inst1) [Ask, Bid, Trade Price]
<Fri Nov 15 16:54:35> INFO: New subscriptions:
ExampleFeed: InstCode2 (Inst2) [Ask, Bid, Trade Price]
<Fri Nov 15 16:54:35> INFO: FeedController:AdapterFeed Subscribing to instrument 'InstCode2' for feed 'ExampleFeed' (sub=0xca2cf90). 3 fields; [Ask, Bid, Trade Price]
<Fri Nov 15 16:54:35> INFO: ExampleFeed doSubscribe called
<Fri Nov 15 16:54:35> INFO: ExampleFeed instrument: InstCode2
<Fri Nov 15 16:54:35> INFO: ExampleFeed field 0: Ask
<Fri Nov 15 16:54:35> INFO: ExampleFeed field 1: Bid
<Fri Nov 15 16:54:35> INFO: ExampleFeed field 2: Trade Price
Sample 6 Status OK
ExampleFeed Inst1 Ask=55.16, TradePrice=55.18, Bid=55.17
ExampleFeed Inst2 Ask=55.19, TradePrice=55.21, Bid=55.2
Sample 7 Status OK
ExampleFeed Inst1 Ask=55.22, TradePrice=55.24, Bid=55.23
ExampleFeed Inst2 Ask=55.25, TradePrice=55.27, Bid=55.26
Sample 8 Status OK
ExampleFeed Inst1 Ask=55.28, TradePrice=55.3, Bid=55.29
ExampleFeed Inst2 Ask=55.31, TradePrice=55.33, Bid=55.32
Sample 9 Status OK
ExampleFeed Inst1 Ask=55.34, TradePrice=55.36, Bid=55.35
ExampleFeed Inst2 Ask=55.37, TradePrice=55.39, Bid=55.38
Sample 10 Status OK
ExampleFeed Inst1 Ask=55.4, TradePrice=55.42, Bid=55.41
ExampleFeed Inst2 Ask=55.43, TradePrice=55.45, Bid=55.44
<Fri Nov 15 16:54:40> INFO: Computing subscription differences for feed ExampleFeed (mdmTest)
<Fri Nov 15 16:54:40> INFO: Existing subscription data:
InstCode1 [Ask, Bid, Trade Price]
InstCode2 [Ask, Bid, Trade Price]
<Fri Nov 15 16:54:40> INFO: Unaltered subscriptions:
ExampleFeed: InstCode2 (Inst2) [Ask, Bid, Trade Price]
<Fri Nov 15 16:54:40> INFO: New subscriptions: (none)
<Fri Nov 15 16:54:40> INFO: FeedController:AdapterFeed Unsubscribing from instrument 'InstCode1' for feed 'ExampleFeed' (sub=0xca2acd8).
<Fri Nov 15 16:54:40> INFO: ExampleFeed doUnsubscribe called
<Fri Nov 15 16:54:40> INFO: ExampleFeed instrument: InstCode1
Sample 11 Status OK
ExampleFeed Inst2 Ask=55.46, TradePrice=55.48, Bid=55.47
Sample 12 Status OK
ExampleFeed Inst2 Ask=55.49, TradePrice=55.51, Bid=55.5
Sample 13 Status OK
ExampleFeed Inst2 Ask=55.52, TradePrice=55.54, Bid=55.53
Sample 14 Status OK
ExampleFeed Inst2 Ask=55.55, TradePrice=55.57, Bid=55.56
Sample 15 Status OK
ExampleFeed Inst2 Ask=55.58, TradePrice=55.6, Bid=55.59
<Fri Nov 15 16:54:45> INFO: mdmTest Script doSample returned false, exiting...
<Fri Nov 15 16:54:45> INFO: FeedController:AdapterFeed Calling destroy_feed() for feed 'ExampleFeed', feed_id=0.
<Fri Nov 15 16:54:45> INFO: ExampleFeed doStop called
<Fri Nov 15 16:54:45> INFO: ExampleFeed Publishing thread stopping
<Fri Nov 15 16:54:45> INFO: FeedController:AdapterLibrary Unloading feed library 'example/example.so'.
If your output does not look like the above and instead contains several lines saying "Status ERROR: Failed to load shared library", you may be running a 32-bit Netprobe on a 64-bit machine and have compiled a 64-bit example.so file. When run on Linux, this produces the following error message (about 10 lines down in the output):
ERROR: FeedController:AdapterFeed Failed loading shared library for 'mdmTest' feed 'ExampleFeed'.
Error [Failed to load library 'example/example.so'. example/example.so: wrong ELF class: ELFCLASS64]
To fix this, either run with a 64-bit Netprobe, or clean your output (make clean) and compile a 32-bit example.so file with the command make test CFLAGS=-m32 LFLAGS=-m32.
For a more complete description of what is happening in this example, please see the section Testing your C++ feed adapter below.
Introduction
The Feed Adapter C++ API provides an object-oriented set of classes built on top of the C API. The goal of this API is to provide useful functionality common to all feeds, reducing the development effort required to create a new feed adapter shared-object.
Source code for the C++ classes (both the API classes and an example Feed implementation) is provided with the API Kit package, and should be compiled with the appropriate compiler for both the target platform and the direct feed API (if any) being used. When using the provided Makefile, the API classes will be compiled into a single library archive, mdm-feed-api.a, which can then be re-used by other feed adapter libraries you may develop.
The output binary should be a shared-object library, which may require specific compiler flags (e.g. to produce position-independent code) to be provided during compilation.
Classes
The following diagram shows the main classes that make up the Feed Adapter C++ API. The shaded classes are those provided by the C++ API. The un-shaded classes are example objects shipped alongside the API, and should be replaced with an actual implementation.
Class | Description |
---|---|
geneos::mdm::feedapi::FeedManager | Manages creation and destruction of Feed instances, and forwards C API method calls to the appropriate C++ object. |
geneos::mdm::feedapi::FeedFactory | Interface for Feed creation. |
geneos::mdm::feedapi::Feed | An abstract class providing the basic functionality of a feed. |
geneos::mdm::feedapi::InstrumentInfo | A record representing an instrument subscription. |
geneos::mdm::feedapi::Thread, geneos::mdm::feedapi::Runnable | These classes (not shown in the diagram) abstract basic platform-specific threading functionality into a simple object. |
geneos::mdm::feedapi::Mutex | This class (not shown in the diagram) abstracts platform-specific mutual exclusion (locking) functionality into a simple object. |
Example::ExampleFeedFactory | A FeedFactory implementation which creates ExampleFeed instances. |
Example::ExampleFeed | An example concrete implementation of a Feed. |
Example::ExamplePublisher | A helper class that publishes updates for the ExampleFeed. |
Working with the API
When using the C++ API, there are 3 steps that you must perform to create a feed adapter library:
- Subclass the abstract Feed class and implement the pure-virtual methods.
- Write a FeedFactory implementation that creates instances of your new Feed subclass.
- Implement the following C API functions. These functions are not implemented directly by the C++ API library because they need customisation.
The source code for the ExampleFeed class may provide a useful guide to these steps.
Writing a Feed subclass
The following five abstract methods of the Feed class must be implemented to produce a concrete subclass.
Method | Description |
---|---|
Feed::doValidateConfig | Parses and validates feed configuration parameters. |
Feed::doStart | Starts the feed (e.g. begins connecting to the market data system). |
Feed::doStop | Stops the feed (e.g. disconnects from the market data system). |
Feed::doSubscribe | Subscribes for instrument data. |
Feed::doUnsubscribe | Unsubscribes from an instrument. |
These methods correspond roughly to the life-cycle of a feed. A feed is created for each unique feed configuration added by an MDM plug-in script. This configuration is validated and, if validation succeeds, the feed is started. Once started, the feed is expected to establish and maintain a connection to a market data system (or other data source), and to transform the data received into a form that can be published to the Netprobe.
If started successfully, the feed should continue running until requested to terminate. Termination requests are sent when the feed is stopped manually by an MDM plug-in script, or when the MDM plug-in or the Netprobe itself is terminated. A feed may be shared between different MDM plug-in instances; hence a configuration change in one plug-in instance will not necessarily result in the feed being terminated.
While running, the feed may receive a number of subscribe / unsubscribe requests for instrument data. These requests are sent in response to changes to the instrument configuration. The feed should respond by making the appropriate requests to the connected market data system.
Feed status reporting
While running, feeds should maintain a short status description (to be displayed in a dataview) that is updated in response to changes to the feed state.
The status should indicate the general state of the feed, i.e. whether or not it is able to receive data. The full text of error messages, and errors for particular instruments, should instead be reported via the API-call error mechanism or via logging, either using a feed-specific mechanism or by writing to the Netprobe log with the thread-safe Feed::log method.
The feed status can be updated by calling the Feed::setStatus method. This method is thread-safe and can be called from any thread.
Data publishing
When a subscription request is made to a feed, an InstrumentInfo object is passed as an argument. This object is specific to the feed, and is not visible to any other feeds hosted by the feed adapter library. It not only identifies the instrument to subscribe to, but is also intended to simplify publishing data to the Netprobe.

Pseudo-code showing the expected usage of the InstrumentInfo class is as follows:
Feed& feed = ...
InstrumentInfo& info = ...

// if an image or update was received
if (dataReceived())
{
    info.newUpdate();

    // for each field in the received data
    for (auto field = data.fields.begin(); field != data.fields.end(); ++field)
    {
        if (info.isFieldOfInterest(field->name))
            info.setFieldValue(field->name, field->value);
    }

    feed.publishImage(info);  // if an instrument image was received
    feed.publishUpdate(info); // if an update was received
}
The Feed::publishImage and Feed::publishUpdate methods are both thread-safe. InstrumentInfo objects are not thread-safe, and should only be accessed by a single thread at a time.
Testing your C++ feed adapter
You can test your feed adapter library using a Netprobe, by running it with the -mdmtest command-line option. This option loads an MDM sampler script (written in Lua) in stand-alone (testing) mode, meaning it can be run without requiring a Geneos Gateway process and associated configuration. Using this option, the Netprobe will load and exercise your feed adapter library in accordance with the contents of the Lua script.
The API Kit package comes with a Lua script that briefly exercises the C++ example feed. As per the Quick Start instructions, running make test will execute the script. You can also run the script manually if required, by executing the following commands:
$ cd mdmFeedAdapterApiKit
$ ../netprobe.linux_64 -mdmtest scripts/mdmTest_example.so.lua
The Lua script file mdmTest_example.so.lua is written using the data processing / analysis API to control the MDM plug-in. (For more information on the APIs used by this file, please refer to the MDM plug-in documentation.) The top part of the file contains utility functions that print all ticks received from the example feed to the Active Console, allowing visual confirmation of published data.
Following this is the feed configuration, displayed in the code block below. This configuration creates a new feed using the specified shared-object file (the library setting). It also passes two parameters, setting1 and setting2, to the feed (the example table); these are received in addition to the standard feed parameters in the Feed::doValidateConfig method. When started, the feed will be created and will subscribe to the specified instruments using the configured field codes.
-- Create an example feed, subscribed to a single instrument
local exampleFeed = md.addFeed("ExampleFeed", {
feed = {
type = "example",
verbose = "true",
library = { filename = "example/example.so", skipVersionCheck = "true" }
},
example = { setting1 = "v1", setting2 = "v2" },
instruments = { Inst1 = "InstCode1" },
fields = { Ask = "Ask", Bid = "Bid", TradePrice = "Trade Price" }
})
exampleFeed:start()
The last part of the script contains the sampling configuration. Each sample (every second), the Netprobe calls the doSample function. This function prints the feed status and any new ticks received from the feed since the last sample. It also simulates new instrument subscribe and unsubscribe events (the subscribe and unsubscribe calls in the code below).
-- Our sample method, called every second
local count = 0
gs.doSample = function()
count = count + 1
print("Sample "..count, "Status "..exampleFeed:getStatus())
printTicks(exampleFeed, "Inst1")
printTicks(exampleFeed, "Inst2")
if count == 5 then
-- Subscribe to a new instrument
exampleFeed:subscribe("Inst2", "InstCode2")
elseif count == 10 then
-- Unsubscribe from an existing instrument
exampleFeed:unsubscribe("Inst1")
elseif count >= 15 then
-- End the test
return false
end
end
Note: Users are encouraged to edit the MDM test script to test their specific feed adapter implementation.
In particular, the feed parameters, instrument codes and field names will likely need to be changed to subscribe to real data with the feed adapter library.
Altering the doSample function may also be necessary, to adjust the timings for dynamically adding and removing instrument subscriptions.
Developing a Feed Adapter in Lua
Quick Start
To get started with an example Lua feed implementation, please follow the steps below:

- Download the API Kit package, and a suitable Netprobe for your operating system.
- Unpack the contents of both packages into the same directory.
- Obtain an MDM application key from your ITRS support contact (showing that you have accepted the Geneos API agreement) and place it in your Netprobe directory.
- Run the Netprobe with the -mdmtest command-line option, to execute the example in stand-alone mode (see example below).
- Alter the mdmFeedAdapterApiKit/scripts/inputFeed.lua script as required to implement your own feed adapter.
An example terminal session showing these steps on Linux is shown below. For other platforms, you will need to change the Netprobe suffix in the commands (e.g. use netprobe.sunx86 instead of netprobe.linux). Users on Windows machines may also need to change the path separators to \ (backslash) instead of / (slash).
$ ls
mdm_appkey.txt mdmFeedAdapterApiKit.tar.gz netprobe.linux.tar.gz
$ mkdir unpack
$ cd unpack
$ tar xf ../netprobe.linux.tar.gz
$ tar xf ../mdmFeedAdapterApiKit.tar.gz
$ cp ../mdm_appkey.txt .
$ ./netprobe.linux -mdmtest mdmFeedAdapterApiKit/scripts/mdmTest_geneos-feed-lua.so.lua
... <Netprobe test output>
<Fri Nov 15 17:04:56> INFO: mdmTest Reading key file 'mdm_appkey.txt', contents listed below.
<Fri Nov 15 17:04:56> INFO: mdmTest Client : ITRS Group
<Fri Nov 15 17:04:56> INFO: mdmTest Contact : Doc-Generator
<Fri Nov 15 17:04:56> INFO: mdmTest Test mode : Enabled
<Fri Nov 15 17:04:56> INFO: mdmTest Gateway mode: Enabled
<Fri Nov 15 17:04:56> INFO: FeedController Create feed request for 'LuaFeed' from plugin 'mdmTest'.
<Fri Nov 15 17:04:56> INFO: FeedController First-time load of shared-library geneos-feed-lua.so
<Fri Nov 15 17:04:56> INFO: FeedController:AdapterLibrary Successfully loaded feed library 'flm/geneos-feed-lua.so'. Reported version: 'RA-131115'.
<Fri Nov 15 17:04:56> INFO: FeedController:AdapterFeed Calling create_feed() with parameters:
feed.library.filename => geneos-feed-lua.so
feed.name => LuaFeed
feed.type => lua
feed.verbose => true
lua.script => mdmFeedAdapterApiKit/scripts/inputFeed.lua
lua.value.max => 1000
lua.value.min => 1
<Fri Nov 15 17:04:56> INFO: FeedController:AdapterFeed Feed creation successful, feed_id=0.
<Fri Nov 15 17:04:56> INFO: Computing subscription differences for feed LuaFeed (mdmTest)
<Fri Nov 15 17:04:56> INFO: Existing subscription data: (none)
<Fri Nov 15 17:04:56> INFO: Unaltered subscriptions: (none)
<Fri Nov 15 17:04:56> INFO: New subscriptions:
LuaFeed: CODE.INST.1 (Inst1) [Ask, Bid, Trade]
LuaFeed: CODE.INST.2 (Inst2) [Ask, Bid, Trade]
<Fri Nov 15 17:04:56> INFO: FeedController:AdapterFeed Subscribing to instrument 'CODE.INST.1' for feed 'LuaFeed' (sub=0xb823c70). 3 fields; [Ask, Bid, Trade]
<Fri Nov 15 17:04:56> INFO: FeedController:AdapterFeed Subscribing to instrument 'CODE.INST.2' for feed 'LuaFeed' (sub=0xb823870). 3 fields; [Ask, Bid, Trade]
<Fri Nov 15 17:04:56> INFO: LuaFeed Parameters:
<Fri Nov 15 17:04:56> INFO: LuaFeed Parameter 'feed.library.filename' has value 'geneos-feed-lua.so'
<Fri Nov 15 17:04:56> INFO: LuaFeed Parameter 'lua.value.min' has value '1'
<Fri Nov 15 17:04:56> INFO: LuaFeed Parameter 'feed.verbose' has value 'true'
<Fri Nov 15 17:04:56> INFO: LuaFeed Parameter 'lua.value.max' has value '1000'
<Fri Nov 15 17:04:56> INFO: LuaFeed Parameter 'feed.name' has value 'LuaFeed'
<Fri Nov 15 17:04:56> INFO: LuaFeed Parameter 'feed.type' has value 'lua'
<Fri Nov 15 17:04:56> INFO: LuaFeed Parameter 'lua.script' has value 'mdmFeedAdapterApiKit/scripts/inputFeed.lua'
<Fri Nov 15 17:04:56> INFO: LuaFeed
<Fri Nov 15 17:04:56> INFO: LuaFeed Using minimum = 1
<Fri Nov 15 17:04:56> INFO: LuaFeed Using maximum = 1000
<Fri Nov 15 17:04:56> INFO: LuaFeed Added subscription for instrument 'CODE.INST.1'
<Fri Nov 15 17:04:56> INFO: LuaFeed Added subscription for instrument 'CODE.INST.2'
Sample 1 Status OK
LuaFeed Inst1 F3=591, F2=795, F1=699
LuaFeed Inst2 F3=662, F2=754, F1=81
Sample 2 Status OK
LuaFeed Inst1 F3=427, F2=340, F1=736
LuaFeed Inst2 F3=55, F2=569, F1=115
Sample 3 Status OK
LuaFeed Inst1 F3=176, F2=975, F1=706
LuaFeed Inst2 F3=355, F2=975, F1=277
Sample 4 Status OK
LuaFeed Inst1 F3=576, F2=473, F1=680
LuaFeed Inst2 F3=183, F2=797, F1=452
Sample 5 Status OK
LuaFeed Inst1 F3=597, F2=660, F1=212
LuaFeed Inst2 F3=584, F2=145, F1=830
<Fri Nov 15 17:05:01> INFO: Computing subscription differences for feed LuaFeed (mdmTest)
<Fri Nov 15 17:05:01> INFO: Existing subscription data:
CODE.INST.1 [Ask, Bid, Trade]
CODE.INST.2 [Ask, Bid, Trade]
<Fri Nov 15 17:05:01> INFO: Unaltered subscriptions:
LuaFeed: CODE.INST.1 (Inst1) [Ask, Bid, Trade]
<Fri Nov 15 17:05:01> INFO: New subscriptions: (none)
<Fri Nov 15 17:05:01> INFO: FeedController:AdapterFeed Unsubscribing from instrument 'CODE.INST.2' for feed 'LuaFeed' (sub=0xb823870).
<Fri Nov 15 17:05:01> INFO: LuaFeed Removed subscription for instrument 'CODE.INST.2'
Sample 6 Status OK
LuaFeed Inst1 F3=50, F2=25, F1=837
Sample 7 Status OK
LuaFeed Inst1 F3=827, F2=830, F1=627
Sample 8 Status OK
LuaFeed Inst1 F3=922, F2=181, F1=256
<Fri Nov 15 17:05:04> INFO: Computing subscription differences for feed LuaFeed (mdmTest)
<Fri Nov 15 17:05:04> INFO: Existing subscription data:
CODE.INST.1 [Ask, Bid, Trade]
<Fri Nov 15 17:05:04> INFO: Unaltered subscriptions:
LuaFeed: CODE.INST.1 (Inst1) [Ask, Bid, Trade]
<Fri Nov 15 17:05:04> INFO: New subscriptions:
LuaFeed: NEW.CODE.INST.2 (Inst2) [F1, F3]
<Fri Nov 15 17:05:04> INFO: FeedController:AdapterFeed Subscribing to instrument 'NEW.CODE.INST.2' for feed 'LuaFeed' (sub=0xb823940). 2 fields; [F1, F3]
<Fri Nov 15 17:05:04> INFO: LuaFeed Added subscription for instrument 'NEW.CODE.INST.2'
Sample 9 Status OK
LuaFeed Inst1 F3=355, F2=317, F1=79
LuaFeed Inst2 F3=922, F1=26
Sample 10 Status OK
LuaFeed Inst1 F3=554, F2=677, F1=745
LuaFeed Inst2 F3=118, F1=134
<Fri Nov 15 17:05:06> INFO: mdmTest Script doSample returned false, exiting...
<Fri Nov 15 17:05:06> INFO: FeedController:AdapterFeed Calling destroy_feed() for feed 'LuaFeed', feed_id=0.
<Fri Nov 15 17:05:06> INFO: LuaFeed Terminate event received, ending feed script
<Fri Nov 15 17:05:07> INFO: LuaFeed Finished executing script 'mdmFeedAdapterApiKit/scripts/inputFeed.lua' for feed 'LuaFeed'.
<Fri Nov 15 17:05:07> INFO: FeedController:AdapterLibrary Unloading feed library 'flm/geneos-feed-lua.so'.
This command executes a Lua feed adapter input script, mdmFeedAdapterApiKit/scripts/inputFeed.lua. By running the Netprobe with the -mdmtest command-line option, the script can be run stand-alone, without requiring a Geneos Gateway process and associated configuration. The argument of the -mdmtest option actually specifies an MDM sampler script, mdmTest_geneos-feed-lua.so.lua, the content of which is described more fully in the Market Data Monitor (MDM) plug-in documentation.
Note: the two Lua scripts serve different purposes and are not the same:

- inputFeed.lua is a feed adapter written in Lua, using the API discussed in this section.
- mdmTest_geneos-feed-lua.so.lua is a sampler script used to exercise the feed adapter.
Introduction
Lua feed adapter scripts are executed by the standard Lua feed adapter provided by ITRS. Multiple feed adapter scripts can be executed simultaneously, each one providing access to a different market data feed. Each script is executed in a separate thread so that the feeds can publish data to the Netprobe while other monitoring tasks are performed. No data is shared between different Lua scripts unless they explicitly communicate with each other in some way (e.g. using files or via a TCP/IP connection).
Lua feed adapter input scripts should be written using version 5.1 of the Lua language. Some Netprobe platforms support Just-In-Time compilation of Lua code via the LuaJIT implementation of Lua, which provides improved execution speed of the Lua script. These platforms also allow use of the Lua FFI library, which allows calling of C functions from Lua.
Netprobe type | Lua implementation |
---|---|
linux | LuaJIT 2.0.2 |
linux_64 | LuaJIT 2.0.2 |
sun | Lua 5.1.5 |
sunx86 | Lua 5.1.5 |
sunx86_64 | Lua 5.1.5 |
windows | LuaJIT 2.0.2 |
Writing the Lua feed adapter script
Getting a script running
The Lua script is expected to run for the lifetime of the feed, maintaining a connection to the market data system (or other source). Incoming instrument data from the source system should be translated to a format which can be published to the Netprobe. If the script exits unexpectedly, the feed status changes to an error state which can be detected in the MDM sampler script.
For example, when running the Quick Start with an empty inputFeed.lua script, the following output is displayed.
$ ./netprobe.linux -mdmtest mdmFeedAdapterApiKit/scripts/mdmTest_geneos-feed-lua.so.lua
... <some output omitted>
Sample 8 Status ERROR: Script exited
Sample 9 Status ERROR: Script exited
Sample 10 Status ERROR: Script exited
Lua test script sampler.doSample returned false, exiting...
<Fri Sep 20 08:36:32> INFO: FeedController:AdapterFeed Calling destroy_feed() for feed 'LuaFeed', feed_id=0.
<Fri Sep 20 08:36:32> INFO: FeedController:AdapterLibrary Unloading feed library 'flm/geneos-feed-lua.so'.
If we ensure that inputFeed.lua never exits, by adding an infinite loop, then we get the following:
-- infinite loop, sleeps for 1 second each iteration
while true do
    feed.sleep(1.0)
end
$ ./netprobe.linux -mdmtest mdmFeedAdapterApiKit/scripts/mdmTest_geneos-feed-lua.so.lua
... <some output omitted>
Sample 8 Status OK
Sample 9 Status OK
Sample 10 Status OK
Lua test script sampler.doSample returned false, exiting...
<Thu Sep 19 16:14:39> INFO: FeedController:AdapterFeed Calling destroy_feed() for feed 'LuaFeed', feed_id=0.
... <test never terminates>
The test does not terminate because the feed adapter script is still running in its infinite loop, even though the sampler script has finished. To end the test you will need to press Ctrl+C in your terminal window or kill the Netprobe process.
Responding to events
The "failure to terminate" problem we just encountered occurs because the Lua script executes in its own thread. This means the script can run asynchronously from the Netprobe, allowing tick data to be published even while the Netprobe is busy with other monitoring tasks. However, since it is the main Netprobe thread that manages the feeds, requests to the feed adapter (such as asking it to terminate) are sent via an event queue.
Checking for events is done using the getEvent() function, which is provided as part of the Lua "feed" module. This function checks the event queue and returns the first event on the queue, or nil if there are no events. getEvent() returns several types of event, but to fix our immediate problem we are only interested in the terminate event.
Updating our code to respond to the terminate event now gives us the following:
local run = true
while run do
    local event = feed.getEvent()
    if event and event.type == "terminate" then
        -- respond to terminate event by ending the loop and exiting the script
        print("terminating feed script")
        run = false
    end
    feed.sleep(1.0)
end
$ ./netprobe.linux -mdmtest mdmFeedAdapterApiKit/scripts/mdmTest_geneos-feed-lua.so.lua
... <some output omitted>
Sample 9 Status OK
Sample 10 Status OK
Lua test script sampler.doSample returned false, exiting...
<Fri Sep 20 08:44:11> INFO: FeedController:AdapterFeed Calling destroy_feed() for feed 'LuaFeed', feed_id=0.
terminating feed script
<Fri Sep 20 08:44:12> INFO: LuaFeed Finished executing script 'scripts/luaFeed_Input.lua' for feed 'LuaFeed'.
<Fri Sep 20 08:44:12> INFO: FeedController:AdapterLibrary Unloading feed library 'flm/geneos-feed-lua.so'.
Other events that we may receive are the subscribe and unsubscribe events. These indicate requests from the Netprobe to subscribe for (or unsubscribe from) a particular instrument; the instruments are configured by the MDM sampler script. To identify which event has been received, check the type field of the event object.
local run = true
while run do
    local event = feed.getEvent()
    if event then
        if event.type == "terminate" then
            run = false
        elseif event.type == "subscribe" then
            -- TODO subscribe to the instrument on the market data system
        elseif event.type == "unsubscribe" then
            -- TODO unsubscribe from the instrument on the market data system
        end
    end
    feed.sleep(1.0)
end
We can tidy this code up somewhat by writing a separate function to handle each type of event. If we store these functions in a table keyed by event type, we can use the type name to look up the appropriate handler function for the event and call it directly.
local run = true
local handler_map = {
    terminate   = function(event) run = false end,
    subscribe   = function(event) print("TODO subscribe") end,
    unsubscribe = function(event) print("TODO unsubscribe") end
}

while run do
    local event = feed.getEvent()
    if event then
        local handler_func = handler_map[event.type]
        if handler_func then
            -- call the handler, passing the event as the argument
            handler_func(event)
        end
    end
    feed.sleep(1.0)
end
Finally, one remaining problem is that the script will sleep for 1 second between processing each event from the Netprobe. This is fine for terminate events, since only one such event will be received near the end of the script lifetime, but multiple subscribe and unsubscribe events may arrive and should be processed promptly. We fix this last issue by checking for further events after processing each one.
The code below now forms a simple event loop for responding to Netprobe events. You will see parts of this code in the inputFeed.lua example file. When writing your own feed adapter script implementation, the feed.sleep(1.0) call should be replaced with a check (or wait) for data from the market data system. If waiting for data, this should ideally be a timed wait, so that the script still checks for Netprobe events even when data is not being received.
local run = true
local handler_map = {
    terminate   = function(event) run = false end,
    subscribe   = function(event) print("TODO subscribe") end,
    unsubscribe = function(event) print("TODO unsubscribe") end
}

local function checkEvents()
    local event = feed.getEvent()
    while event do
        local handler_func = handler_map[event.type]
        if handler_func then
            handler_func(event)
        end
        event = feed.getEvent()
    end
end

while run do
    checkEvents()
    feed.sleep(1.0) -- TODO replace with check for data/events from the market data system
end
Subscribing and publishing
Instruments to subscribe to are sent to the feed script from the Netprobe as a subscribe event. The event contains the stock-code or RIC of the instrument, and the names of the fields which should be published for this instrument. An example subscribe event is shown below. To remove an instrument, the Netprobe sends an unsubscribe event.
local subscribeEvent = {
    type = "subscribe",
    instrument = "MY.CODE.N",
    fields = { ASK="", BID="", COMMENT="" }
}
This table structure is similar to that of a tick, which is used for publishing:
local tick = {
    time = 1379409456.763708,
    instrument = "MY.CODE.N",
    fields = { ASK=128.75, BID=128.50, COMMENT="A string value" }
}
feed.publish(tick)
Because the two types are so similar, you can create a tick from a subscribe event simply by removing the type field and adding a time field (which indicates when the tick was received). The time field should then be updated every time the tick field values are updated, before calling feed.publish(). An example of this can be found in the inputFeed.lua script.
Feed Parameters
Parameters to the Lua feed script are passed in the params table. Both keys and values in this table are strings. A number of default parameters will be passed, including:

- feed.name - The name of the feed.
- feed.library.filename - The filename of the ITRS standard Lua feed adapter shared-object.
- lua.script - The Lua script being executed.
Other user-defined configuration parameters (as defined in the MDM sampler script when the feed is added) will also appear in this table. When using the test script to exercise the feed, you can add additional parameters by adding new fields to the lua table (see below) in the call to addFeed. Nested tables in this structure will be flattened to dotted-name parameter keys.
local lf = md.addFeed(
    "LuaFeed",
    {
        feed = {
            type = "lua",
            verbose = "true",
            library = { filename = "geneos-feed-lua.so" }
        },
        lua = {
            script = "inputFeed.lua",
            value = { min = 1, max = 1000 }
        },
        instruments = { Inst1 = "CODE.INST.1", Inst2 = "CODE.INST.2" },
        fields = { F1 = "Trade", F2 = "Ask", F3 = "Bid" }
    }
)
This configuration results in the following parameters being passed to the inputFeed.lua script:
feed.params = {
    ["feed.name"] = "LuaFeed",
    ["feed.type"] = "lua",
    ["feed.verbose"] = "true",
    ["feed.library.filename"] = "geneos-feed-lua.so",
    ["lua.script"] = "inputFeed.lua",
    ["lua.value.min"] = "1",
    ["lua.value.max"] = "1000"
}
Configuration using a Geneos Gateway
To use your feed in a live MDM plug-in (as opposed to the Netprobe stand-alone mode) you should:
- Deploy your feed script to a location accessible by the Netprobe.
- Configure an MDM plug-in on the Gateway as described in the plug-in documentation.
- Adjust the MDM sampler script (if necessary) to reference your feed script.
Aside from the script location settings, from the point of view of the MDM sampler script the Lua feed adapter behaves as a regular feed.