Fake Data Agent

The Fake Data Agent is provided with OCS to help demonstrate and debug issues with data aggregation and display. It will generate random data and pass it to an OCS feed.

usage: agent.py [-h] [--mode {idle,acq}] [--num-channels NUM_CHANNELS]
                [--sample-rate SAMPLE_RATE] [--frame-length FRAME_LENGTH]

Agent Options

--mode

Possible choices: idle, acq

Default: "idle"

--num-channels

Number of fake readout channels to produce. Channels are co-sampled.

Default: 2

--sample-rate

Frequency at which to produce data, in Hz.

Default: 9.5

--frame-length

Frame length, in seconds, to pass to the feed's aggregator parameters.

Default: 60

Configuration File Examples

Below are configuration examples for the OCS config file and for running the Agent in a Docker container.

OCS Site Config

To configure the Fake Data Agent, we need to add a FakeDataAgent block to our OCS configuration file. Here is an example configuration block using all of the available arguments:

{'agent-class': 'FakeDataAgent',
 'instance-id': 'fake-data1',
 'arguments': [['--mode', 'acq'],
               ['--num-channels', '16'],
               ['--sample-rate', '4']]},

Docker Compose

The Fake Data Agent can also be run in a Docker container. An example Docker Compose service configuration is shown here:

fake-data1:
    image: simonsobs/ocs:latest
    hostname: ocs-docker
    environment:
      - LOGLEVEL=info
      - INSTANCE_ID=fake-data1
    volumes:
      - ${OCS_CONFIG_DIR}:/config:ro

Agent API

class ocs.agents.fake_data.agent.FakeDataAgent(agent, num_channels=2, sample_rate=10.0, frame_length=60)[source]
acq(test_mode=False, degradation_period=None)[source]

Process - Acquire data and write to the feed.

Parameters:
  • test_mode (bool, optional) – Run the acq Process loop only once. This is meant only for testing. Default is False.

  • degradation_period (float, optional) – If set, alternately mark the running Process as degraded / not degraded with this period (in seconds).

Notes

The most recent fake values are stored in the session data object in the format:

>>> response.session['data']
{"fields":
    {"channel_00": 0.10250430068515494,
     "channel_01": 0.08550903376216404,
     "channel_02": 0.10481891991693446,
     "channel_03": 0.10793263271024509},
 "timestamp":1600448753.9288929}

The channels kept in fields are the 'faked' data, in a structure similar to that of the Lakeshore Agents. 'timestamp' is the time at which these values were last updated.
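
As a rough sketch of how this looks from a control script, assuming the 'fake-data1' instance-id from the site config example above, a reachable crossbar server, and OCS's generic OCSClient from ocs.ocs_client (adapt the instance-id to your deployment):

from ocs.ocs_client import OCSClient

# Connect to the agent instance configured in the site config example above.
client = OCSClient('fake-data1')

client.acq.start()               # start the acquisition Process
status = client.acq.status()     # poll without stopping the Process
print(status.session['data'])    # most recent fake values, as shown above

client.acq.stop()                # ask the Process to exit
client.acq.wait()                # block until it has stopped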

set_heartbeat(heartbeat=True)[source]

Task - Set the state of the agent heartbeat.

Parameters:
  • heartbeat (bool, optional) – True for on (the default), False for off.
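
For example, with the same assumed client as in the acq sketch above, the heartbeat can be toggled off and back on (a minimal illustration, not a required workflow):

client.set_heartbeat.start(heartbeat=False)   # turn the heartbeat off
client.set_heartbeat.wait()

client.set_heartbeat.start()                  # heartbeat=True is the default
client.set_heartbeat.wait()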

delay_task(delay=5, succeed=True)[source]

Task (abortable) - Sleep (delay) for the requested number of seconds.

This can run simultaneously with the acq Process. This Task should run in the reactor thread.

Parameters:
  • delay (float, optional) – Time to wait before returning, in seconds. Defaults to 5.

  • succeed (bool, optional) – Whether to return success or not. Defaults to True.

Notes

The session data will be updated with the requested delay as well as the time elapsed so far, for example:

>>> response.session['data']
{'requested_delay': 5.,
 'delay_so_far': 1.2}
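
Continuing the hedged client sketch from above, the Task can be started alongside acq, polled for its progress, and then waited on:

client.delay_task.start(delay=10, succeed=True)   # begin the delay
status = client.delay_task.status()
print(status.session['data'])    # e.g. {'requested_delay': 10.0, 'delay_so_far': 1.2}
client.delay_task.wait()         # block until the delay finishes

Because the Task is marked abortable, it can also be interrupted before the delay completes (e.g. with client.delay_task.abort(), where supported by your OCS version).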