This integration enables Digital Out of Home (DOOH) attribution using the DOOH ad log files from your DSP.
To get started with onboarding DOOH ad logs, you will need to add a data source for your ad log file.

You can choose cloud storage or a simple file upload to onboard your ad log files.

The DOOH ad log file must be a CSV file with the headers given below:
oohid,oohname,latitude,longitude,radius,ad_start_time_utc,ad_play_time,ad_duration_in_seconds,creative,lineid

Example of the ad log file:
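
The sample rows below are hypothetical placeholders, included only to illustrate the expected format:

```csv
oohid,oohname,latitude,longitude,radius,ad_start_time_utc,ad_play_time,ad_duration_in_seconds,creative,lineid
OOH-1001,Example Mall Entrance Screen,12.9716,77.5946,100,2021-07-01 09:00:00,2021-07-01 09:00:05,15,summer_sale_v1.mp4,LINE-2001
OOH-1002,Example Highway Billboard,12.9352,77.6245,250,2021-07-01 09:15:00,2021-07-01 09:15:02,30,summer_sale_v2.mp4,LINE-2002
```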


All the above headers and fields are mandatory for optimal DOOH-based attribution results.
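
If you would like to sanity-check a file before onboarding it, a short script along the following lines can confirm that every required header is present. This is an illustrative sketch, not part of the platform; the file name is a placeholder:

```python
import csv

# Headers required by the DOOH ad log schema described above.
REQUIRED_HEADERS = {
    "oohid", "oohname", "latitude", "longitude", "radius",
    "ad_start_time_utc", "ad_play_time", "ad_duration_in_seconds",
    "creative", "lineid",
}

def validate_ad_log(path):
    """Raise ValueError if the CSV at `path` is missing any required header."""
    with open(path, newline="") as f:
        headers = set(next(csv.reader(f)))
    missing = REQUIRED_HEADERS - headers
    if missing:
        raise ValueError(f"Ad log file is missing headers: {sorted(missing)}")

validate_ad_log("dooh_ad_log.csv")  # placeholder file name
```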

You can choose either File Upload or Cloud Storage depending on where your file sits. For files larger than 50 MB, please choose Cloud Storage as the data source.

Please follow the steps for one of these source types to integrate your ad log data source.

  1. File Upload
    If your ad log data resides in a CSV file, you can upload it with our File Upload source option. Our file upload tool will automatically encrypt your file before uploading it to your secure bucket. Please ensure you prepare your file in the specified format shown in our template.

    a. Select File Upload as the source.

    b. Enter a source name

    1. Indicates where to enter the source name for the upload file.

    2. Lets you select the file from your computer to upload to the platform.


    c. Click Next to map your identifiers to our schema.

    1. Indicates that the file has been successfully hashed and uploaded.

    2. Click the Next button to move to the Schema Mapping section.



      Schema Mapping

    If your file is successfully uploaded, you will be taken to the schema mapping section.

    1. Here, you will need to map your ad log data to the ad log data schema. You can see your file headers under "Your Data".

    2. To map the headers, select the drop-down menu under “Lifesight Data” and choose the corresponding DOOH header attribute for each. Once every attribute is mapped, the final result will resemble the illustration after these steps, and you can save it.

    3. Click “Save” to complete adding the source.
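
    For illustration only, if your file's headers differed slightly from the schema, the final mapping might look like the following (the left-hand header names are hypothetical):

      Your Data          Lifesight Data
      ooh_screen_id  ->  oohid
      screen_name    ->  oohname
      lat            ->  latitude
      lng            ->  longitude
      ...and so on for the remaining headers.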

  2. Cloud Storage

    If your data resides in a cloud storage location, you can integrate this source type to securely extract it from that location.

    GCS as Cloud Storage

    To begin, select a cloud storage provider. This section walks through Google Cloud Storage (GCS) as the cloud storage provider.


    The data from your cloud storage sources is collected via our secure connector integration. After you configure your cloud storage account with your credentials, our platform will extract the data in batches every hour and transfer it to your secure bucket.


    Setting up Source

    Enter your source name

    1. Enter the following details to configure your source:

      Bucket Name - The name of the GCS bucket in which your data resides. Bucket names are globally unique across all GCS accounts.
      Bucket Region - The region in which your bucket resides.
      Key Prefix - The prefix and delimiter combination that locates your data inside the bucket, in case it resides inside folders. For example, if your files live under a dooh/logs/ folder, the key prefix is dooh/logs/.

    Credentials - As a prerequisite, you will need to create a Service account with access to the bucket above. Creating a Service account key generates a JSON file; you can upload that JSON file with the credentials as is, and Lifesight will extract the required information.
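
    For reference, a GCS service account key file has roughly the following shape; every value below is a placeholder:

    ```json
    {
      "type": "service_account",
      "project_id": "your-project-id",
      "private_key_id": "0123456789abcdef",
      "private_key": "-----BEGIN PRIVATE KEY-----\n...placeholder...\n-----END PRIVATE KEY-----\n",
      "client_email": "dooh-connector@your-project-id.iam.gserviceaccount.com",
      "client_id": "123456789012345678901",
      "auth_uri": "https://accounts.google.com/o/oauth2/auth",
      "token_uri": "https://oauth2.googleapis.com/token"
    }
    ```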

    Schema Mapping

    The rest of the schema mapping steps are the same as in the File Upload section.

    Amazon S3 as Cloud Storage

    If your ad log data resides in a cloud storage location, you can integrate this source type to securely extract it from that location.

    To begin, select a cloud storage provider. This section walks through Amazon S3 as the cloud storage provider.


    The data from your cloud storage sources is collected via our secure connector integration. After you configure your cloud storage account with your credentials, our platform will extract the data in batches every hour and transfer it to your secure bucket.


    Setup Source - Amazon S3

    1. Enter the source name

      Enter the following details to configure your source:

    2. Key Prefix - The prefix and delimiter combination that locates your data inside the bucket, in case it resides inside folders.

    3. Bucket Region - The region in which your bucket resides.

    4. Bucket Name - The name of the Amazon S3 bucket in which your data resides. Bucket names are globally unique across all AWS accounts.

    5. Secret Token - Your Secret Access Key. Together with the Access Key Id, it is used to access buckets through APIs, much like the username/password pair you use to access your AWS Management Console.

    6. Access Key - Your Access Key Id, which identifies the credentials used to access buckets through APIs.

    7. After filling in all the above details, you can click Next to move to the Schema Mapping section.
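
    If you want to verify the details before saving, a short sketch along these lines, using the boto3 library, confirms that the credentials can list objects under the key prefix. All values here are placeholders for the details you entered above:

    ```python
    import boto3

    # Placeholder credentials and bucket details; substitute your own.
    s3 = boto3.client(
        "s3",
        region_name="us-east-1",
        aws_access_key_id="YOUR_ACCESS_KEY_ID",
        aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
    )

    # List up to five objects under the key prefix to confirm access.
    resp = s3.list_objects_v2(Bucket="your-ad-log-bucket", Prefix="dooh/logs/", MaxKeys=5)
    for obj in resp.get("Contents", []):
        print(obj["Key"])
    ```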

    Schema Mapping

    The rest of the schema mapping steps are the same as in the File Upload section.

