Access Your Exported Data in Your Own Environment

Once you have successfully created an export process, you can access its contents on the selected target connector. The contents may update over time, depending on the delivery frequency of any updating products or assets within the export process. Each defined export uses a unique prefix location on the connector, which can be used to integrate the data into your own environment.

Connector

Desktop

Once the data has been prepared, you can initiate a browser-based download via the Download button on the export definition page.

S3 Bucket

When an S3 connector-based export is created, a new sub-folder with a random GUID name is created within the S3 location associated with the connector. The current view of the data content is transferred into this location at the point when the export is created on the Platform.

This data content can then be accessed via any authenticated AWS S3 access method.
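For example, the minimal Python sketch below mirrors the export contents locally using boto3 (an assumption; any S3-capable tooling will work). It presumes AWS credentials are already configured, and the bucket name and export GUID are placeholders taken from your connector and export details.

    import os
    import boto3

    bucket = "<connector_bucket>"   # S3 bucket associated with the connector
    prefix = "<export_GUID>/"       # unique sub-folder created for this export

    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")

    # Walk every object under the export prefix and mirror it locally.
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            local_path = os.path.join("export_data", obj["Key"])
            os.makedirs(os.path.dirname(local_path), exist_ok=True)
            s3.download_file(bucket, obj["Key"], local_path)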

Google Cloud Storage (GCS)

When a GCS connector-based export is created, a new sub-folder with a random GUID name is created within the GCS location associated with the connector. The current view of the data content is transferred into this location at the point when the export is created on the Platform.

This data content can then be accessed via any authenticated GCP Cloud Storage access method.
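An equivalent sketch using the google-cloud-storage client library (one option among several; default GCP credentials assumed, with placeholder bucket and export GUID values):

    import os
    from google.cloud import storage

    bucket_name = "<connector_bucket>"   # GCS bucket associated with the connector
    prefix = "<export_GUID>/"            # unique sub-folder created for this export

    client = storage.Client()            # picks up default GCP credentials
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        if blob.name.endswith("/"):      # skip folder placeholder objects
            continue
        local_path = os.path.join("export_data", blob.name)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        blob.download_to_filename(local_path)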

Azure Blob Storage

When an Azure Blob Storage connector-based export is created, a new sub-folder with a random GUID name is created within the Azure Blob Storage location associated with the connector. The current view of the data content is transferred into this location at the point when the export is created on the Platform.

This data content can then be accessed via any authenticated Azure Blob Storage access method.
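A comparable sketch using the azure-storage-blob client library (a connection string is used here for simplicity; it is one of several Azure authentication options, and the connection string, container name, and export GUID are placeholders):

    import os
    from azure.storage.blob import ContainerClient

    container = ContainerClient.from_connection_string(
        conn_str="<connection_string>",          # credentials for the storage account
        container_name="<connector_container>",
    )

    # List every blob under the export prefix and write it to disk.
    for blob in container.list_blobs(name_starts_with="<export_GUID>/"):
        local_path = os.path.join("export_data", blob.name)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        with open(local_path, "wb") as f:
            f.write(container.download_blob(blob.name).readall())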

SFTP

When an SFTP connector-based export is created, a new sub-folder with a random GUID name is created within your user-specific zone on the platform-hosted SFTP server. The current view of the data content is transferred into this location at the point when the export is created on the Platform.

Retrieve the connector SSH key required to access this hosted SFTP server instance from the connector details. This SSH key can be used in any SFTP client to connect to the server and copy or download the contents as required; it can also be used programmatically, as shown in the sketch after the client walkthrough below.

For security purposes, only the connector owner can access the SSH key via the platform UI. If you are using an SFTP connector that you do not own, contact the connector owner, who will be able to provide you with the connector SSH key to access your export.

Download an appropriate SFTP client to allow you to log in to the server (an example walkthrough is provided below).

  1. For Mac, the recommended client is Cyberduck (FileZilla is also suitable) - https://cyberduck.io/download/

  2. For Windows, the recommended client is WinSCP - https://winscp.net/download/WinSCP-5.15.9-Setup.exe

  3. Once downloaded and installed, launch your client (Cyberduck is used in this example) and select ‘Open Connection’

  4. The default protocol is FTP; ensure that you switch the protocol to SFTP

  5. Once SFTP has been selected, log in to your SFTP server by entering:

    1. Server

    2. Username

    3. SSH Private Key - this must first be downloaded from your SFTP connector so that it can be uploaded

  6. Once the key file is downloaded, upload it to the SSH Private Key field

  7. Select Connect to complete the server login and access your data
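Alternatively, the download can be scripted. The sketch below uses the paramiko Python library (an assumption; any SFTP-capable library will work), with placeholder host, username, key path, and export GUID values taken from your connector details, and assumes the delivery folder contains files only:

    import os
    import paramiko

    host = "<sftp_server>"
    username = "<connector_username>"
    key = paramiko.RSAKey.from_private_key_file("<path_to_connector_ssh_key>")

    # Open an SSH transport and start an SFTP session over it.
    transport = paramiko.Transport((host, 22))
    transport.connect(username=username, pkey=key)
    sftp = paramiko.SFTPClient.from_transport(transport)

    remote_dir = "<export_GUID>"    # the GUID-named sub-folder for this export
    os.makedirs("export_data", exist_ok=True)
    for name in sftp.listdir(remote_dir):
        sftp.get(f"{remote_dir}/{name}", os.path.join("export_data", name))

    sftp.close()
    transport.close()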

Delivery Frequency

When creating exports against updating assets, one of the two delivery options below can be chosen. When creating exports for static products or assets, the behavior is the same as the ‘Static Export’ option below.

Static Export

Static exports provide a one-off export of data content as it exists at the time of the export creation.

When using a connector-based delivery mechanism, this exported content is written to a sub-directory named unscheduled beneath the GUID-based home location of the export (for example, s3://<connector_bucket>/<export_GUID>/unscheduled/ on AWS S3).

Repeating Export - TNFs

Repeating exports provide an ongoing data feed that automatically pushes data content to the connector location each time the product or asset updates. Each distinct export delivery comprises:

  • Delivery of the data to a newly generated GUID sub-folder within the top-level home location of the export. This results in a folder structure such as the following (the initial path identifier is cloud specific; AWS S3 is used here as the example): s3://<connector_bucket>/<export_GUID>/<export_delivery_GUID>/

  • Delivery of a Transfer Notification File (TNF) to the /tnfs/ sub-folder created beneath the top-level home location of the export. These TNFs are named using a timestamp convention to track the delivery history, and each contains the export_delivery_GUID value of the corresponding data transfer in its DataFolder JSON field to facilitate path resolution.

Cloud storage event triggers can be set up against the /tnfs/ folder to detect content delivery and ingest it programmatically into downstream processes where exported content is used in operational workflows.
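As one possible implementation for the S3 connector case, the hypothetical AWS Lambda handler below reacts to an s3:ObjectCreated event on the /tnfs/ prefix, reads the TNF (assumed here to be a plain JSON document, given the DataFolder field is described as JSON), and resolves the path of the corresponding data delivery:

    import json
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            tnf_key = record["s3"]["object"]["key"]   # e.g. <export_GUID>/tnfs/<timestamp>

            # Read the TNF and extract the GUID of the delivered data folder.
            body = s3.get_object(Bucket=bucket, Key=tnf_key)["Body"].read()
            delivery_guid = json.loads(body)["DataFolder"]

            export_guid = tnf_key.split("/")[0]
            data_prefix = f"{export_guid}/{delivery_guid}/"
            # Hand the prefix off to the downstream ingestion process here.
            print(f"New delivery available at s3://{bucket}/{data_prefix}")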

