Event List and Event Details Integration Guide

Welcome!

Smarter AI is an enablement software platform for AI cameras that see, listen, and understand.

The Smarter AI platform is composed of:

  • cameras / camera libraries
  • cloud servers / services
  • mobile viewer apps / libraries

The Smarter AI platform supports event triggers based on any combination of:

  • inferencing jobs and workflows
  • sensor data
  • telematics data

This guide will show you how to integrate event lists and event details into your web applications via Smarter AI server/service REST APIs.

Event List and Details

Here is an illustration of how Events and Event Details can be represented in an Events Dashboard. On the left is a list of Events, and on the right is a detailed view of the selected Event.

Next, we dive into the steps needed to create the building blocks of your future monitoring dashboard. Please note that both cameras and devices will be referred to as cameras.


Step 1. Getting Ready

1.1. Before we start, make sure you have the following:

  • Tenant Name → Your tenant is your private working space for your users, cameras, events, data, and more.
  • API Key → The key to access your tenant through Smarter AI Rest APIs.

Please contact Smarter AI if you're missing your Tenant Name or API Key.

1.2. Please ensure that your cameras are up to date with the latest Camera App, System Image, OEM Image, etc.


Step 2. Get the list of Events

In this section, we'll cover how to:

  • Get the list of cameras.
  • Get the list of Events for a specific camera or for all cameras.

2.1. Retrieve a list of all cameras in your tenant. Pick one and copy its id; it will be used in the next command. This API has several parameters:

  • type - endpoint type:
    • APP - mobile application client
    • DEVICE - a dashcam, a gateway, etc.
  • status - device or app status
    • ACTIVE - onboarded and operational
    • RESET - removed from this tenant
curl --request GET \
     --url 'https://api.smarterai.com/v4/endpoints?type=device&status=active' \
     --header 'Accept: application/json' \
     --header 'SAI-Key: YOUR_API_KEY'
[
  {
    "userId": [USERID],
    "status": "ACTIVE",
    "id": [ENDPOINTID],
    "type": "DEVICE"
  },
  
...

  {
    "userId": [USERID],
    "status": "ACTIVE",
    "id": [ENDPOINTID],
    "type": "DEVICE"
  }
]

📘

Note

The userId will be used to look up the camera user's name in the next step.
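As a convenience, here is a sketch (assuming jq is installed) that lists each endpoint's id together with its userId from the response above:

# List endpoint ids and their userIds (sketch, requires jq)
curl --silent --request GET \
     --url 'https://api.smarterai.com/v4/endpoints?type=device&status=active' \
     --header 'Accept: application/json' \
     --header 'SAI-Key: YOUR_API_KEY' \
  | jq -r '.[] | "\(.id) \(.userId)"'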

2.2. Get the user details for the camera (this resolves the userId saved previously into a name and email).

curl --request GET \
     --url https://api.smarterai.com/v4/endpoints/[ENDPOINTID]/acl \
     --header 'Accept: application/json' \
     --header 'SAI-Key: YOUR_API_KEY'
[
  {
    "userId": [USERID],
    "name": "User",
    "email": "[email protected]",
    "role": "OWNER",
    "imageUrl": "IMG_URL"
  }
]

📘

Note

Multiple users can have access to the same camera with identical or different roles.
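If you only need the owner's display name, a jq filter over the ACL response works; this is a sketch assuming jq is installed:

# Extract the name of the user with the OWNER role (sketch, requires jq)
curl --silent --request GET \
     --url https://api.smarterai.com/v4/endpoints/[ENDPOINTID]/acl \
     --header 'Accept: application/json' \
     --header 'SAI-Key: YOUR_API_KEY' \
  | jq -r '.[] | select(.role == "OWNER") | .name'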

2.3. Get the total event count.

curl --head \
     --url https://api.smarterai.com/v4/events \
     --header 'Accept: application/json' \
     --header 'SAI-Key: YOUR_API_KEY'
HTTP/1.1 200 
Vary: Origin
Vary: Access-Control-Request-Method
Vary: Access-Control-Request-Headers
X-Total-Count: 131
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Pragma: no-cache
Expires: 0
Strict-Transport-Security: max-age=31536000 ; includeSubDomains
X-Frame-Options: DENY
Set-Cookie: JSESSIONID=D458E0E3F96F5326C8805DB13C79A7DD; Path=/; Secure; HttpOnly
Content-Type: application/json
Transfer-Encoding: chunked
Date: Fri, 05 Aug 2022 11:11:10 GMT

📘

Note

In our example, the X-Total-Count value is 131.
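Because the count is returned as a response header rather than in the body, you can extract it with a small filter; the snippet below is a sketch using grep and awk.

# Print only the X-Total-Count header value (sketch)
curl --silent --head \
     --url https://api.smarterai.com/v4/events \
     --header 'Accept: application/json' \
     --header 'SAI-Key: YOUR_API_KEY' \
  | grep -i '^X-Total-Count:' \
  | awk '{print $2}' | tr -d '\r'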

2.4. Get an Event List for a camera. You can set several parameters:

  • endpointId → the camera (device) id saved in step 2.1
  • startTime & endTime → define the time range, in epoch milliseconds
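As a sketch (assuming GNU date), the timestamps used in the request below, 1658217600000 and 1658476800000, correspond to 2022-07-19 08:00 UTC and 2022-07-22 08:00 UTC and can be computed like this:

# Compute startTime/endTime in epoch milliseconds (sketch; GNU date syntax)
startTime=$(( $(date -d '2022-07-19 08:00:00 UTC' +%s) * 1000 ))
endTime=$((   $(date -d '2022-07-22 08:00:00 UTC' +%s) * 1000 ))
echo "$startTime $endTime"   # 1658217600000 1658476800000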
curl --request GET \
     --url 'https://api.smarterai.com/v4/events?endpointId=[ENDPOINTID]&startTime=1658217600000&endTime=1658476800000' \
     --header 'Accept: application/json' \
     --header 'SAI-Key: YOUR_API_KEY'
[
  {
    "id": "818690217686328",
    "endpointId": [ENDPOINTID],
    "timestamp": 1659705818041,
    "triggerId": "b63ad4e0-ebb3-456c-9649-0c3d8c463965",
    "triggerName": "My Trigger",
    "deviceLabel": "CAM01020000",
    "snapshot1": "SNAPSHOT_URL1",
    "snapshot2": "SNAPSHOT_URL2"
  },

...

  {
    "id": "818667314111082",
    "endpointId": [ENDPOINTID],
    "timestamp": 1659703526909,
    "triggerId": "b63ad4e0-ebb3-456c-9649-0c3d8c463965",
    "triggerName": "My Trigger",
    "deviceLabel": "CAM01020002",
    "snapshot1": "SNAPSHOT_URL1",
    "snapshot2": "SNAPSHOT_URL2"
  }
]

📘

Note

Here are the key parts of the API response:

  • triggerName
  • deviceLabel
  • snapshot1 → imager 1 snapshot
  • snapshot2 → imager 2 snapshot
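To turn this response into rows for an Event List view (id, timestamp, trigger name, device label), a jq one-liner helps; this is a sketch assuming jq is installed:

# Tabulate events: id, timestamp, triggerName, deviceLabel (sketch, requires jq)
curl --silent --request GET \
     --url 'https://api.smarterai.com/v4/events?endpointId=[ENDPOINTID]&startTime=1658217600000&endTime=1658476800000' \
     --header 'Accept: application/json' \
     --header 'SAI-Key: YOUR_API_KEY' \
  | jq -r '.[] | [.id, .timestamp, .triggerName, .deviceLabel] | @tsv'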

2.5. Get an Event List for all cameras.

# Array with device (endpoint) IDs - continue the array if needed
deviceIDs=("0000" "0001")
for item in "${deviceIDs[@]}"
do
  url='https://api.smarterai.com/v4/events?endpointId='$item'&startTime=1658217600000&endTime=1658476800000'
  curl --request GET \
       --url "$url" \
       --header 'Accept: application/json' \
       --header 'SAI-Key: YOUR_API_KEY'

  # Process results
  # ...
done

Here is the information collected in this section; it makes up the left part of the dashboard:

  1. Events Count
  2. Video Snapshots
  3. Device Name
  4. User Name
  5. Trigger Name (will be used later)
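For example, the per-camera event count (item 1 above) can be obtained by piping each response through jq; the loop below is a sketch that assumes jq is installed and that deviceIDs holds your endpoint ids.

# Count events per camera for the same time range (sketch, requires jq)
for item in "${deviceIDs[@]}"
do
  url='https://api.smarterai.com/v4/events?endpointId='$item'&startTime=1658217600000&endTime=1658476800000'
  count=$(curl --silent --request GET \
               --url "$url" \
               --header 'Accept: application/json' \
               --header 'SAI-Key: YOUR_API_KEY' \
          | jq 'length')
  echo "endpoint $item: $count events"
done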

Event List Dashboard Left Part Decomposition


Step 3. Event Details

Get access to all the data and details of an Event

3.1. Get the video and sensor data corresponding to an Event. This request provides Event-related data and information, such as:

  • Pointers (URLs) to media files (dual- and multi-imager support)
  • Sensor data such as accelerometer, gyroscope, location, decoded vehicle data (CAN bus), etc.
curl --request GET \
     --url https://api.smarterai.com/v4/events/804774584580474 \
     --header 'Accept: application/json' \
     --header 'SAI-Key: YOUR_API_KEY'
{
  "id": "804774584580474",
  "tenantId": "[TENANTID]",
  "endpointId": [ENDPOINTID],
  "deviceId": "CAM01020002",
  "startTimestamp": 1658314255636,
  "endTimestamp": null,
  "sources": [
    "Sensor",
    "Sensor"
  ],
  "triggerId": "2693baea-e4a8-495b-a39f-309f277f5e6d",
  "triggerName": "Right Turn",
  "deviceLabel": "John Doe's Car Camera",
  "userInitiatedEventRequestId": null,
  "snapshot1": "[SNAPSHOT1_URI]",
  "snapshot2": "[SNAPSHOT2_URI]",
  "recordings": [
    {
      "id": 1395712,
      "startTimestamp": 1658314225237,
      "endTimestamp": 1658314258167,
      "source": "sensor",
      "type": "DATA",
      "url": "[RECORD_ID]"
    },
    {
      "id": 1395752,
      "startTimestamp": 1658314258261,
      "endTimestamp": 1658314291179,
      "source": "sensor",
      "type": "DATA",
      "url": "[RECORD_ID]"
    },
    {
      "id": 1395750,
      "startTimestamp": 1658314240324,
      "endTimestamp": 1658314273394,
      "source": "vid_1",
      "type": "VIDEO",
      "url": "[RECORD_ID]"
    },
    {
      "id": 1395784,
      "startTimestamp": 1658314273448,
      "endTimestamp": 1658314306386,
      "source": "vid_1",
      "type": "VIDEO",
      "url": "[RECORD_ID]"
    },
    {
      "id": 1395754,
      "startTimestamp": 1658314240759,
      "endTimestamp": 1658314274093,
      "source": "vid_2",
      "type": "VIDEO",
      "url": "[RECORD_ID]"
    },
    {
      "id": 1395835,
      "startTimestamp": 1658314274167,
      "endTimestamp": 1658314307297,
      "source": "vid_2",
      "type": "VIDEO",
      "url": "[RECORD_ID]"
    },
    {
      "id": 1395727,
      "startTimestamp": 1658314238364,
      "endTimestamp": 1658314270697,
      "source": "vid_2",
      "type": "VIDEO_META",
      "url": "[RECORD_ID]"
    },
    {
      "id": 1395774,
      "startTimestamp": 1658314271504,
      "endTimestamp": 1658314303516,
      "source": "vid_2",
      "type": "VIDEO_META",
      "url": "[RECORD_ID]"
    }
  ]
}

The API response includes:

  • Snapshot URLs - snapshots from all imagers.
  • Items in the recordings array with "source": "sensor" and "type": "DATA" contain sensor data, including:
    • Accelerometer
    • Gyroscope
    • Geolocation
    • and more

In the Smarter AI system, video is available in the form of clips of equal duration. For a particular Event, you'll have several consecutive video clips.

  • Items in the recordings array with "source": "vid_1" or "source": "vid_2" and "type": "VIDEO" contain video clip URLs. There can be several recordings depending on settings.
    Example: A 1-minute-long Event with 10-second-long video clips will have between 6 and 8 associated video clips.

  • Items in the recordings array with "source": "vid_1" or "source": "vid_2" and "type": "VIDEO_META" contain video metadata (see the sketch after this list):

    • AI model labels with confidence levels
    • Bounding box coordinates
    • Video resolution
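To separate these recording types programmatically, you can filter the recordings array by type. Below is a sketch using jq against a saved response (event.json is an assumed local file name):

# Split the recordings array by type (sketch, requires jq; event.json is the saved API response)
jq -r '.recordings[] | select(.type == "DATA")       | [.source, .startTimestamp, .url] | @tsv' event.json
jq -r '.recordings[] | select(.type == "VIDEO")      | [.source, .startTimestamp, .url] | @tsv' event.json
jq -r '.recordings[] | select(.type == "VIDEO_META") | [.source, .startTimestamp, .url] | @tsv' event.json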

3.2. Understanding Event data.

  • Sensor and other kinds of data (CAN bus, location, etc.) are time-synchronized with the video clips.
  • Data temporal resolution is configurable.
    • For video, the default is typically 15 Hz. Both the temporal and spatial resolution depend on the camera's hardware capabilities and current load (i.e., a more or less powerful camera and a more or less loaded camera).
    • For sensor data, the rate depends on the sensor hardware and your data limits; it is usually between 1 Hz (typically GPS) and 400 Hz (accelerometer).

Here is a sample of a downloaded sensor DATA recording:
2 1658314225981 -0.000046 -0.000443 0.000046
0 1658314225982 -0.003280 0.009997 0.052085
0 1658314225983 -0.030869 -0.004143 0.037036
0 1658314225996 -0.020178 0.013456 -0.058421
0 1658314225996 -0.029841 0.005512 -0.052965
0 1658314225996 0.004620 -0.011817 -0.003578
0 1658314225996 0.044609 0.028508 0.010184
2 1658314225996 0.000061 -0.000443 -0.000473
2 1658314225996 0.000565 -0.000443 -0.000610
2 1658314226007 0.000198 -0.000443 -0.000900
2 1658314226007 -0.000244 -0.000031 -0.000671
3 1658314226039 25.199066 55.272907 1.300000 0.000000 0.000000 2.202271
0 1658314226044 0.027424 -0.019660 0.065564
0 1658314226044 0.026949 -0.019217 0.064528
0 1658314226044 0.039667 -0.011731 0.053471
0 1658314226044 0.013204 0.014147 -0.011868
2 1658314226046 0.001038 0.000534 -0.000839
2 1658314226046 0.001236 0.000427 -0.000473
2 1658314226046 0.000259 0.000473 -0.000473
2 1658314226046 0.000275 0.000168 -0.000473

📘

Note

File Format → (CSV format) ID TS VAL1 … VAL6

  • ID - data type identifier
    • 0 - Accelerometer
    • 2 - Gyroscope
    • 3 - GPS Data
  • TS - timestamp (epoch milliseconds)
  • VALx:
    • for ID 0 - Accelerometer → Ax Ay Az
    • for ID 2 - Gyroscope → Gx Gy Gz
    • for ID 3 - GPS Data → Latitude Longitude Altitude Speed Bearing Accuracy
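If you need to split a downloaded DATA file by sensor type on the command line, awk can filter on the ID column; the commands below are a sketch (sensor_data.txt is an assumed local file name):

# Split the sensor DATA file by its ID column (sketch)
awk '$1 == 0' sensor_data.txt > accelerometer.txt   # ID 0 -> accelerometer rows
awk '$1 == 2' sensor_data.txt > gyroscope.txt       # ID 2 -> gyroscope rows
awk '$1 == 3' sensor_data.txt > gps.txt             # ID 3 -> GPS rows

The next sample shows the contents of a VIDEO_META recording, in JSON format.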
[
  [
    {
      "detections": [
        [
          "LEFT_EYE_CONFIDENCE",
          99.32,
          "%"
        ],
        [
          "RIGHT_EYE_CONFIDENCE",
          95.49,
          "%"
        ],
        [
          "LEARNT_NEUTRAL_HEADPOSE_PITCH",
          -10.65,
          "°"
        ],
        [
          "LEARNT_NEUTRAL_HEADPOSE_YAW",
          8.79,
          "°"
        ],
        [
          "LEARNT_NEUTRAL_HEADPOSE_ROLL",
          1.85,
          "°"
        ],
        [
          "HEADPOSE_PITCH",
          -15.11,
          "°"
        ],
        [
          "HEADPOSE_YAW",
          0.08,
          "°"
        ],
        [
          "HEADPOSE_ROLL",
          -1.93,
          "°"
        ]
      ],
      "bbox": [
        709,
        164,
        908,
        359
      ],
      "ts": 1658314238364,
      "dimen": [
        720,
        1280
      ]
    }
  ],

  [ ... ]

]

📘

Note

File Format (JSON format)

  • detections - array of label data triplets (Label, Value, Unit)
  • bbox - bounding box corner coordinates (X1, Y1, X2, Y2)
  • ts - timestamp
  • dimen - frame resolution (Height, Width)
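To read specific labels out of a VIDEO_META file, you can walk the nested arrays with jq; the command below is a sketch (meta.json is an assumed local file name) that prints each detection's timestamp, bounding box, and HEADPOSE_YAW value:

# Print ts, bbox, and HEADPOSE_YAW per detection (sketch, requires jq; meta.json is the downloaded VIDEO_META file)
jq -r '.[][] | [.ts, (.bbox | join(",")), (.detections[] | select(.[0] == "HEADPOSE_YAW") | .[1])] | @tsv' meta.json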

Here is an example of how you can render video and sensor data in a GUI.

  1. Video Recordings (dual-imager)
  2. Inferencing Output
  3. Sensor Data
  4. Event Type and Event Date
  5. Camera Location

Event List Dashboard Right Part Decomposition

Data Visualization Examples

Here are some examples of how to display sensor data. Clear data visualizations give operators a useful interface for investigating use cases such as:

  • Driver Attention Warning
  • Drowsiness Detection
  • Traffic Collision Reconstruction
  • Predictive Vehicle Maintenance
  • Driver Exoneration

Here are examples of visualization for sensor data:

Individual tile's view

Event's Sensor Data Plotting

Combined line chart

Event's Sensor Data Plotting
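As a rough sketch of how such a combined chart could be produced from the accelerometer rows split out earlier (assuming gnuplot is installed and accelerometer.txt exists):

# Plot Ax, Ay, Az over time from the accelerometer rows (sketch, requires gnuplot)
gnuplot -persist <<'EOF'
set xlabel 'timestamp (ms)'
set ylabel 'acceleration'
plot 'accelerometer.txt' using 2:3 with lines title 'Ax', \
     ''                  using 2:4 with lines title 'Ay', \
     ''                  using 2:5 with lines title 'Az'
EOF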

Camera location map. You can even draw tracks on it.

Event's Vehicle Location

Event player with integrated and synchronized data view

Event's Event Player Example
