
Data Sources & Integrations


4.1 Supported Data Types

The platform ingests and processes multiple data types:

Type                 | Examples                                               | Format
Quantitative Metrics | Carbon emissions, water use, waste volumes             | Numbers, time series
Financial Data       | Revenue, expenses, CapEx, OpEx                         | Currency, ratios
Survey Responses     | Employee engagement, stakeholder feedback              | Likert scales, open text
Operational Data     | Production volumes, workforce statistics, supply chain | Aggregated counts
Geographic Data      | Location coordinates, facility addresses               | GeoJSON, coordinates
Document Data        | Reports, certifications, compliance documents          | PDF, text, structured
Regulatory Data      | Compliance status, certifications, licenses            | Status indicators

4.2 Connecting External Systems

ESG Data Providers

  • Bloomberg Terminal
  • Refinitiv / Thomson Reuters
  • S&P Global ESG
  • Custom data feeds

Financial Systems

  • SAP
  • Oracle ERP
  • QuickBooks
  • Xero

Operational Systems

  • Salesforce
  • Workday
  • ServiceNow
  • Manufacturing execution systems (MES)

Surveys & Stakeholder Inputs

  • Custom survey builder (built-in)
  • Qualtrics
  • SurveyMonkey
  • Google Forms

To connect a data source:

  1. Go to Integrations → Data Sources
  2. Select your system from the catalog
  3. Provide authentication credentials (API key, OAuth, username/password)
  4. Map your data fields to platform indicators
  5. Set refresh frequency (real-time, daily, weekly, monthly)
  6. Test connection and validate data
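
Step 4 (field mapping) can be sketched as a simple lookup from source-system field names to platform indicators. The field and indicator names below are illustrative assumptions, not the platform's actual schema:

```python
# Hypothetical field map: source-system field -> platform indicator.
# Names are examples only; use the names from your own mapping screen.
FIELD_MAP = {
    "co2_tonnes": "carbon_emissions",
    "water_m3": "water_use",
    "waste_kg": "waste_volume",
}

def map_record(source_record: dict) -> dict:
    """Translate one source record into platform indicators,
    dropping any fields that have no mapping."""
    return {
        FIELD_MAP[field]: value
        for field, value in source_record.items()
        if field in FIELD_MAP
    }

# "site_id" has no mapping, so it is dropped.
mapped = map_record({"co2_tonnes": 120.5, "water_m3": 800, "site_id": "A1"})
```

Unmapped fields are silently dropped here; in practice you may prefer to log them so gaps in the mapping surface during step 6 (test and validate).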

4.3 API Integration Guide

Direct API access enables:

  • Custom data ingestion workflows
  • Real-time data synchronization
  • Programmatic assessment creation
  • Webhook integration for system events

API Endpoints:

  • GET /api/v1/assessments - List assessments
  • POST /api/v1/assessments - Create new assessment
  • PUT /api/v1/assessments/{id}/data - Submit assessment data
  • GET /api/v1/reports - Generate reports programmatically
  • POST /api/v1/webhooks - Set up event listeners
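
As a sketch of how a client might address these endpoints, the snippet below assembles a request without sending it. The base URL and bearer-token auth scheme are assumptions; consult the platform's API reference for the real host and authentication details:

```python
from urllib.parse import urljoin

BASE_URL = "https://api.example.com"  # placeholder host, not the real one

def build_request(method: str, path: str, api_key: str) -> dict:
    """Assemble the components of an API call (method, URL, headers)
    without performing any network I/O."""
    return {
        "method": method,
        "url": urljoin(BASE_URL, path),
        "headers": {
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
            "Content-Type": "application/json",
        },
    }

# e.g. submitting data for assessment 42 via PUT /api/v1/assessments/{id}/data
req = build_request("PUT", "/api/v1/assessments/42/data", "MY_KEY")
```

Separating request construction from transport like this also makes the client easy to unit-test before pointing it at the live API.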

4.4 Data Validation & Quality Assurance

Every incoming data point passes through six validation stages:

  1. Schema Validation: Data matches expected format and type
  2. Range Checking: Values fall within acceptable parameters
  3. Completeness: Required fields are populated
  4. Anomaly Detection: Unusual spikes or patterns flagged for review
  5. Deduplication: Duplicate entries identified and merged
  6. Audit Trail: Every change logged with timestamp and user
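
The first three stages above (schema, range, completeness) can be sketched as a rule-driven check. The rules and field names here are illustrative, not the platform's real validation schema:

```python
# Hypothetical validation rules: type, optional min/max, required flag.
RULES = {
    "carbon_emissions": {"type": float, "min": 0.0, "required": True},
    "reporting_year": {"type": int, "min": 2000, "max": 2100, "required": True},
}

def validate(record: dict) -> list:
    """Return a list of human-readable validation errors (empty = valid)."""
    errors = []
    for field, rule in RULES.items():
        if field not in record:
            if rule.get("required"):
                errors.append(f"{field}: required field missing")  # completeness
            continue
        value = record[field]
        if not isinstance(value, rule["type"]):                    # schema
            errors.append(f"{field}: expected {rule['type'].__name__}")
            continue
        if "min" in rule and value < rule["min"]:                  # range
            errors.append(f"{field}: below minimum {rule['min']}")
        if "max" in rule and value > rule["max"]:
            errors.append(f"{field}: above maximum {rule['max']}")
    return errors

# A negative emissions figure with a missing year yields two errors.
errors = validate({"carbon_emissions": -5.0})
```

Anomaly detection and deduplication (stages 4–5) typically run across batches rather than single records, so they sit downstream of per-record checks like these.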

To view validation errors:

  • Go to Integrations → Data Import History
  • Filter by status: Successful, Pending, Error
  • Click error to see detailed diagnostic

4.5 Troubleshooting Integrations

Connection Failed

  • Verify credentials are correct and user has API permissions
  • Check IP allowlist if your system uses it
  • Confirm API endpoint is accessible (firewall/proxy)
  • Contact support with API error codes

Data Not Appearing

  • Verify mapping configuration matches your data structure
  • Check data refresh schedule and last sync timestamp
  • Review validation error logs for format issues
  • Ensure user has permission to access source system data

Incomplete Data Transfer

  • Check data volume limits (contact support for enterprise limits)
  • Verify API rate limits are configured appropriately
  • Monitor for timeout issues during large data transfers
  • Use support’s diagnostic tools to trace failed requests

Duplicate or Incorrect Data

  • Verify deduplication rules are configured correctly
  • Check field mapping for accuracy
  • Ensure consistent data quality at source
  • Use data reconciliation report to identify discrepancies
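
One common deduplication rule is "same key, keep the latest update". The sketch below assumes records are keyed by facility, indicator, and reporting period; those key fields are an assumption about the data model, not the platform's documented rule set:

```python
def deduplicate(records: list) -> list:
    """Collapse records sharing a (facility, indicator, period) key,
    keeping the one with the most recent 'updated_at' timestamp."""
    latest = {}
    for rec in records:
        key = (rec["facility"], rec["indicator"], rec["period"])
        if key not in latest or rec["updated_at"] > latest[key]["updated_at"]:
            latest[key] = rec
    return list(latest.values())

rows = [
    {"facility": "A1", "indicator": "water_use", "period": "2024-Q1",
     "value": 800, "updated_at": "2024-04-01"},
    {"facility": "A1", "indicator": "water_use", "period": "2024-Q1",
     "value": 812, "updated_at": "2024-04-15"},  # later correction wins
]
clean = deduplicate(rows)
```

If duplicates persist after a rule like this, the key fields are usually the culprit: two source systems reporting the same facility under different identifiers will never collide on the key.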

For additional help, contact support@sdgl-impact.io or visit the Support Center.