
Bahrain Open Data Portal Connector

Connector Details

Type

Virtual machines, Single VM, BYOL

Runs on

Google Compute Engine

Last Update

24 October 2024

Category

Big data, Analytics, DevOps


Overview

The Bahrain Open Data Connector enables seamless integration with the Bahrain Open Data Portal, providing access to catalog and dataset-related data for applications such as data analytics, visualization tools, or geographic information systems. The connector acts as a proxy to streamline data retrieval, supporting actions for dataset enumeration, record queries, data exports, and facet-based navigation.
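
The examples in this document are minimal Python sketches rather than a definitive client. They assume the connector is reachable over HTTP at a placeholder base URL and that each action is exposed as an endpoint named after the action, with the documented parameters passed as query parameters; the base URL, endpoint naming, and authentication handling are assumptions you should adapt to your actual deployment.

import requests

# Placeholder values: substitute the URL and credentials of your own deployment.
CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"

def call_action(action: str, params: dict | None = None) -> requests.Response:
    """Invoke a connector action by name with the documented query parameters."""
    response = requests.get(f"{CONNECTOR_BASE_URL}/{action}", params=params or {}, timeout=30)
    response.raise_for_status()
    return response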

Integration Overview

This document outlines each integration point, its purpose, configuration, and workflow support using the Bahrain Open Data Connector.

Supported Integration Action Points

  • getDatasets: Retrieves a list of all datasets in the catalog.
  • listExportFormats: Lists available export formats for the dataset catalog.
  • exportDatasets: Exports the dataset catalog in a specified format (e.g., CSV, JSON).
  • exportCatalogCSV: Exports the dataset catalog as CSV with customizable parameters.
  • exportCatalogDCAT: Exports catalog metadata in DCAT-AP format (RDF/XML).
  • getDatasetsFacets: Retrieves global catalog-level facets for guided navigation.
  • getDataset: Retrieves metadata for a specific dataset.
  • getRecords: Retrieves records for a specific dataset.
  • getRecord: Retrieves a specific record from a dataset.
  • listDatasetExportFormats: Lists supported export formats for a dataset.
  • exportRecords: Exports a dataset in a specific format (e.g., CSV, GeoJSON).
  • exportRecordsCSV: Exports a dataset in CSV format with specific parameters.
  • exportRecordsParquet: Exports a dataset in Parquet format.
  • exportRecordsGPX: Exports a dataset in GPX format.
  • getRecordsFacets: Retrieves available facets for a dataset.
  • getDatasetAttachments: Retrieves files or attachments related to a dataset.

Detailed Integration Documentation

Catalog Datasets Retrieval

Action

getDatasets

Purpose

Retrieves a comprehensive list of all datasets in the catalog, serving as the primary entry point for exploring available data.

Parameters

  • Required: None.
  • Optional:
    • select: Specify fields to include (e.g., dataset_id, fields).
    • where: Filter datasets using ODSQL (e.g., publisher="Information & eGovernment Authority").
    • order_by: Sort results (e.g., modified desc).
    • limit: Number of items (default: 10, max: 100).
    • offset: Starting index (default: 0).
    • refine: Filter by facet (e.g., publisher:Information & eGovernment Authority).
    • exclude: Exclude facet values (e.g., publisher:Information & eGovernment Authority).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).
    • group_by: Group results (e.g., publisher).
    • include_links: Include HATEOAS links (boolean, default: false).
    • include_app_metas: Include metadata (boolean, default: false).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a JSON object with total_count, _links, and results (array of dataset objects containing dataset_id, fields, metas, etc.).
  • Failure: Returns error details (e.g., invalid parameters).

Workflow Example

1. Configure the connector for the Bahrain Open Data Portal.
2. Execute the getDatasets action to fetch a list of datasets.
3. Process the response to identify datasets (e.g., population-statistics) for further actions.
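
To make step 2 concrete, the sketch below fetches a page of the catalog, newest first, and prints the dataset identifiers. It reuses the hypothetical base URL and endpoint-per-action convention introduced above; the parameters are the ones documented for getDatasets.

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# List the 20 most recently modified datasets, keeping only their identifiers.
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/getDatasets",
    params={"select": "dataset_id", "order_by": "modified desc", "limit": 20, "offset": 0},
    timeout=30,
)
resp.raise_for_status()
catalog = resp.json()
print("total datasets:", catalog["total_count"])
for dataset in catalog["results"]:
    print(dataset["dataset_id"])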

Catalog Exports Listing

Action

listExportFormats

Purpose

Lists all available export formats for the dataset catalog, helping users choose an appropriate format for data export.

Parameters

  • Required: None.
  • Optional: None.

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a JSON object with a links array listing export format options (e.g., csv, json).
  • Failure: Returns error details (e.g., upstream API error).

Workflow Example

1. Execute the listExportFormats action to retrieve available export formats.
2. Review the response to select a format (e.g., csv) for catalog export.
3. Proceed to the exportDatasets action for export.

Catalog Export by Format

Action

exportDatasets

Purpose

Exports the entire dataset catalog in a user-specified format, such as JSON, CSV, or XLSX, for versatile data use.

Parameters

  • Required: format (e.g., csv, json).
  • Optional:
    • select: Specify fields to include (e.g., dataset_id, fields).
    • where: Filter datasets using ODSQL (e.g., publisher="Information & eGovernment Authority").
    • order_by: Sort results (e.g., modified desc).
    • limit_export: Number of items (default: -1, retrieves all records).
    • offset: Starting index (default: 0).
    • refine: Filter by facet (e.g., publisher:Information & eGovernment Authority).
    • exclude: Exclude facet values (e.g., publisher:Information & eGovernment Authority).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).
    • group_by: Group results (e.g., publisher).
    • include_links: Include HATEOAS links (boolean, default: false).
    • include_app_metas: Include metadata (boolean, default: false).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a file in the specified format.
  • Failure: Returns error details (e.g., unsupported format).

Workflow Example

1. Use the listExportFormats action to identify supported formats.
2. Execute the exportDatasets action with format=csv and limit_export=10.
3. Save the exported CSV file for external analysis.
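
A sketch of this workflow, under the same assumptions as the earlier examples: export the catalog as CSV, limited to 10 entries, and save the response body to a local file.

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# Export the dataset catalog as CSV, capped at 10 entries.
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/exportDatasets",
    params={"format": "csv", "limit_export": 10},
    timeout=60,
)
resp.raise_for_status()
with open("catalog_export.csv", "wb") as fh:
    fh.write(resp.content)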

Catalog CSV Export

Action

exportCatalogCSV

Purpose

Exports the dataset catalog in CSV format with customizable parameters, allowing tailored data extraction for analysis or reporting.

Parameters

  • Required: None.
  • Optional:
    • delimiter: Field delimiter (e.g., ","; default: ";").
    • list_separator: Multivalued string separator (default: ",").
    • quote_all: Quote all strings (boolean, default: false).
    • with_bom: Include Unicode BOM (boolean, default: true).
    • select: Specify fields to include (e.g., dataset_id, fields).
    • where: Filter datasets using ODSQL (e.g., publisher="Information & eGovernment Authority").
    • order_by: Sort results (e.g., modified desc).
    • limit_export: Number of items (default: -1, retrieves all records).
    • offset: Starting index (default: 0).
    • refine: Filter by facet (e.g., publisher:Information & eGovernment Authority).
    • exclude: Exclude facet values (e.g., publisher:Information & eGovernment Authority).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).
    • group_by: Group results (e.g., publisher).
    • include_links: Include HATEOAS links (boolean, default: false).
    • include_app_metas: Include metadata (boolean, default: false).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a CSV file.
  • Failure: Returns error details (e.g., invalid parameters).

Workflow Example

1. Execute the exportCatalogCSV action with delimiter=,.
2. Save the CSV file containing catalog metadata.
3. Use the file for data analysis or reporting.

Catalog DCAT Export

Action

exportCatalogDCAT

Purpose

Exports catalog metadata in RDF/XML (DCAT-AP) format, enabling integration with metadata systems supporting DCAT standards.

Parameters

  • Required: dcat_ap_format (e.g., dcat_ap_ch, dcat_ap_de).
  • Optional:
    • include_exports: Export formats to include (e.g., csv,json).
    • use_labels_in_exports: Use field labels (boolean, default: true).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns an RDF/XML file.
  • Failure: Returns error details (e.g., invalid DCAT format).

Workflow Example

1. Execute the exportCatalogDCAT action with dcat_ap_format=dcat_ap_ch.
2. Save the RDF/XML file.
3. Use the file for metadata integration with DCAT-compatible systems.
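
Sketched below under the same assumptions: request the catalog metadata in DCAT-AP (dcat_ap_ch) and save the RDF/XML response for downstream metadata tooling.

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# Export catalog metadata as DCAT-AP (RDF/XML), including CSV and JSON export links.
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/exportCatalogDCAT",
    params={"dcat_ap_format": "dcat_ap_ch", "include_exports": "csv,json"},
    timeout=60,
)
resp.raise_for_status()
with open("catalog_dcat.rdf", "wb") as fh:
    fh.write(resp.content)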

Catalog Facets Retrieval

Action

getDatasetsFacets

Purpose

Retrieves facet values for datasets to aid navigation, helping users filter datasets based on attributes like publisher.

Parameters

  • Required: None.
  • Optional:
    • facet: Specify facet to retrieve (e.g., publisher).
    • refine: Filter by facet (e.g., publisher:Information & eGovernment Authority).
    • exclude: Exclude facet values (e.g., publisher:Information & eGovernment Authority).
    • where: Filter datasets using ODSQL (e.g., publisher="Information & eGovernment Authority").
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a JSON object with links and facets arrays.
  • Failure: Returns error details (e.g., invalid facet).

Workflow Example

1. Execute the getDatasetsFacets action with facet=publisher.
2. Review facet values (e.g., Information & eGovernment Authority) to refine dataset queries.
3. Use facets to filter results in subsequent actions.
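
The sketch below illustrates step 1 under the same assumptions; the exact shape of the facets array may vary, so it simply prints each facet entry returned.

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# Retrieve catalog-level facets for the "publisher" attribute.
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/getDatasetsFacets",
    params={"facet": "publisher"},
    timeout=30,
)
resp.raise_for_status()
for facet in resp.json().get("facets", []):
    print(facet)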

Dataset Metadata Retrieval

Action

getDataset

Purpose

Retrieves detailed metadata for a specific dataset, including fields and endpoints to plan further data queries or exports.

Parameters

  • Required: dataset_id (e.g., population-statistics).
  • Optional:
    • select: Specify fields to include (e.g., dataset_id, fields).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).
    • include_links: Include HATEOAS links (boolean, default: false).
    • include_app_metas: Include metadata (boolean, default: false).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a JSON object with dataset details (e.g., dataset_id, fields, metas).
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Use the getDatasets action to identify a dataset (e.g., population-statistics).
2. Execute the getDataset action with the selected dataset_id.
3. Review metadata to plan record queries or exports.
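
A minimal sketch of steps 2 and 3, under the same assumptions: fetch the metadata for population-statistics and print its field definitions to plan later record queries.

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# Fetch metadata for a single dataset and inspect its fields.
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/getDataset",
    params={"dataset_id": "population-statistics"},
    timeout=30,
)
resp.raise_for_status()
dataset = resp.json()
for field in dataset.get("fields", []):
    print(field)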

Dataset Records Retrieval

Action

getRecords

Purpose

Queries records from a specific dataset, allowing users to access detailed data entries for analysis or visualization.

Parameters

  • Required: dataset_id.
  • Optional:
    • select: Specify fields to include (e.g., dataset_id, fields).
    • where: Filter datasets using ODSQL (e.g., year=2020).
    • group_by: Group results (e.g., region).
    • order_by: Sort results (e.g., year desc).
    • limit: Number of items (default: 10, max: 100).
    • offset: Starting index (default: 0).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).
    • include_links: Include HATEOAS links (boolean, default: false).
    • include_app_metas: Include metadata (boolean, default: false).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a JSON object with total_count, _links, and results (array of records).
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Use the getDataset action to select a dataset (e.g., population-statistics).
2. Execute the getRecords action with dataset_id=population-statistics and the where filter year=2020.
3. Process the response to display or analyze records.
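
Because limit is capped at 100 items per call, a common pattern is to page through results with offset, as sketched below under the same assumptions (the where filter year=2020 is illustrative and depends on the dataset's actual fields).

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# Page through all records matching the filter, 100 at a time.
records, offset, limit = [], 0, 100
while True:
    resp = requests.get(
        f"{CONNECTOR_BASE_URL}/getRecords",
        params={"dataset_id": "population-statistics", "where": "year=2020",
                "limit": limit, "offset": offset},
        timeout=30,
    )
    resp.raise_for_status()
    page = resp.json()
    records.extend(page["results"])
    offset += limit
    if not page["results"] or offset >= page["total_count"]:
        break
print("records fetched:", len(records))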

Dataset Record Retrieval

Action

getRecord

Purpose

Retrieves a single record from a dataset, providing detailed information for a specific data entry.

Parameters

  • Required: dataset_id, record_id (e.g., pop_001).
  • Optional:
    • select: Specify fields to include (e.g., dataset_id, fields).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a JSON object representing the record.
  • Failure: Returns error details (e.g., invalid record_id).

Workflow Example

1. Use the getRecords action to identify a record in population-statistics.
2. Execute the getRecord action with dataset_id=population-statistics and record_id=pop_001.
3. Review the record details for further processing.

Dataset Exports Listing

Action

listDatasetExportFormats

Purpose

Lists available export formats for a specific dataset, helping users select a format for dataset export.

Parameters

  • Required: dataset_id.
  • Optional:
    • None.

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a JSON object with a links array listing export formats.
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Use the getDataset action to select a dataset (e.g., population-statistics).
2. Execute the listDatasetExportFormats action with dataset_id=population-statistics.
3. Review available formats (e.g., csv, parquet) for export.

Dataset Export by Format

Action

exportRecords

Purpose

Exports a dataset in a user-specified format, such as CSV or GeoJSON, for diverse applications.

Parameters

  • Required: dataset_id, format (e.g., csv, geojson).
  • Optional:
    • select: Specify fields to include (e.g., dataset_id, fields).
    • where: Filter datasets using ODSQL (e.g., year=2020).
    • order_by: Sort results (e.g., year desc).
    • group_by: Group results (e.g., region).
    • limit_export: Number of items (default: -1, retrieves all records).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).
    • use_labels: Use field labels (boolean, default: true).
    • epsg: Coordinate system for geospatial data (e.g., 4326).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a file in the specified format.
  • Failure: Returns error details (e.g., invalid format).

Workflow Example

1. Use the listDatasetExportFormats action to identify supported formats for population-statistics.
2. Execute the exportRecords action with dataset_id=population-statistics and format=csv.
3. Save the exported file for external use.
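
Sketched below under the same assumptions: export the dataset as GeoJSON in the WGS 84 coordinate system (epsg=4326) and save it for use in mapping tools.

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# Export all records of the dataset as GeoJSON (WGS 84).
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/exportRecords",
    params={"dataset_id": "population-statistics", "format": "geojson", "epsg": 4326},
    timeout=120,
)
resp.raise_for_status()
with open("population-statistics.geojson", "wb") as fh:
    fh.write(resp.content)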

Dataset CSV Export

Action

exportRecordsCSV

Purpose

Exports a dataset in CSV format with customizable parameters, enabling tailored data extraction for specific needs.

Parameters

  • Required: dataset_id.
  • Optional:
    • delimiter: Field delimiter (e.g., ","; default: ";").
    • list_separator: Multivalued string separator (default: ",").
    • quote_all: Quote all strings (boolean, default: false).
    • with_bom: Include Unicode BOM (boolean, default: true).
    • select: Specify fields to include (e.g., dataset_id, fields).
    • where: Filter datasets using ODSQL (e.g., year=2020).
    • order_by: Sort results (e.g., year desc).
    • group_by: Group results (e.g., region).
    • limit_export: Number of items (default: -1, retrieves all records).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).
    • use_labels: Use field labels (boolean, default: true).
    • epsg: Coordinate system for geospatial data (e.g., 4326).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a CSV file.
  • Failure: Returns error details (e.g., invalid parameters).

Workflow Example

1. Execute the exportRecordsCSV action with dataset_id=population-statistics and delimiter=,.
2. Save the CSV file containing dataset records.
3. Use the file for data analysis or reporting.
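
The same workflow as a sketch, under the assumptions above: a comma delimiter and human-readable field labels, with the response written to a local CSV file.

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# Export the dataset as CSV with a comma delimiter and field labels as headers.
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/exportRecordsCSV",
    params={"dataset_id": "population-statistics", "delimiter": ",", "use_labels": "true"},
    timeout=120,
)
resp.raise_for_status()
with open("population-statistics.csv", "wb") as fh:
    fh.write(resp.content)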

Dataset Parquet Export

Action

exportRecordsParquet

Purpose

Exports a dataset in Parquet format, ideal for efficient storage and processing in data analytics platforms.

Parameters

  • Required: dataset_id.
  • Optional:
    • parquet_compression: Compression type (snappy or zstd, default: snappy).
    • select: Specify fields to include (e.g., dataset_id, fields).
    • where: Filter datasets using ODSQL (e.g., year=2020).
    • order_by: Sort results (e.g., year desc).
    • group_by: Group results (e.g., region).
    • limit_export: Number of items (default: -1, retrieves all records).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).
    • use_labels: Use field labels (boolean, default: true).
    • epsg: Coordinate system for geospatial data (e.g., 4326).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a Parquet file.
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Execute the exportRecordsParquet action with dataset_id=population-statistics.
2. Save the Parquet file.
3. Use the file for data processing in compatible systems.
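
A sketch under the same assumptions, requesting zstd compression instead of the snappy default and saving the Parquet file for downstream analytics.

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# Export the dataset as Parquet with zstd compression.
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/exportRecordsParquet",
    params={"dataset_id": "population-statistics", "parquet_compression": "zstd"},
    timeout=120,
)
resp.raise_for_status()
with open("population-statistics.parquet", "wb") as fh:
    fh.write(resp.content)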

Dataset GPX Export

Action

exportRecordsGPX

Purpose

Exports a dataset in GPX format, suitable for geographic data visualization and GPS applications.

Parameters

  • Required: dataset_id.
  • Optional:
    • name_field: Field for waypoint names (e.g., region).
    • description_field_list: Fields for waypoint descriptions (e.g., population).
    • use_extension: Include extensions in GPX (boolean, default: true).
    • select: Specify fields to include (e.g., dataset_id, fields).
    • where: Filter datasets using ODSQL (e.g., year=2020).
    • order_by: Sort results (e.g., year desc).
    • group_by: Group results (e.g., region).
    • limit_export: Number of items (default: -1, retrieves all records).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).
    • use_labels: Use field labels (boolean, default: true).
    • epsg: Coordinate system for geospatial data (e.g., 4326).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a GPX file.
  • Failure: Returns error details (e.g., invalid parameters).

Workflow Example

1. Execute the exportRecordsGPX action with dataset_id=population-statistics and name_field=region.
2. Save the GPX file.
3. Use the file for geographic data visualization.
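
Sketched below under the same assumptions: waypoints named from the region field, with population included in the waypoint descriptions (both field names are illustrative and depend on the dataset's schema).

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# Export the dataset as GPX, naming waypoints after the "region" field.
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/exportRecordsGPX",
    params={"dataset_id": "population-statistics", "name_field": "region",
            "description_field_list": "population"},
    timeout=120,
)
resp.raise_for_status()
with open("population-statistics.gpx", "wb") as fh:
    fh.write(resp.content)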

Dataset Facets Retrieval

Action

getRecordsFacets

Purpose

Retrieves facet values for records in a specific dataset, aiding in filtering and navigating dataset records effectively.

Parameters

  • Required: dataset_id.
  • Optional:
    • facet: Specify facet to retrieve (e.g., year).
    • where: Filter datasets using ODSQL (e.g., year=2020).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Bahrain).

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a JSON object with links and facets arrays.
  • Failure: Returns error details (e.g., invalid facet).

Workflow Example

1. Use the getDataset action to select population-statistics.
2. Execute the getRecordsFacets action with dataset_id=population-statistics and facet=year.
3. Use facet values to refine record queries.

Dataset Attachments Retrieval

Action

getDatasetAttachments

Purpose

Retrieves files or attachments related to a specific dataset, providing access to supplementary data or resources.

Parameters

  • Required: dataset_id.
  • Optional:
    • None.

Configuration

Ensure the connector is configured with the appropriate environment variables for authentication and connectivity.

Output

  • Successful: Returns a JSON object with links and attachments arrays (e.g., href, mime-type, title).
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Use the getDataset action to select population-statistics.
2. Execute the getDatasetAttachments action with dataset_id=population-statistics.
3. Download attachments (e.g., population_report.pdf) for further use.
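
A sketch of steps 2 and 3 under the same assumptions: list the dataset's attachments and download each one via its href link (the attachment fields used here follow the href, mime-type, and title keys noted above).

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder

# List a dataset's attachments and download each one by its href.
resp = requests.get(
    f"{CONNECTOR_BASE_URL}/getDatasetAttachments",
    params={"dataset_id": "population-statistics"},
    timeout=30,
)
resp.raise_for_status()
for attachment in resp.json().get("attachments", []):
    print(attachment.get("title"), attachment.get("mime-type"))
    file_resp = requests.get(attachment["href"], timeout=60)
    file_resp.raise_for_status()
    with open(attachment.get("title", "attachment"), "wb") as fh:
        fh.write(file_resp.content)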

Workflow Creation with the Connector

Example Workflow: Exploring and Exporting Population Statistics

Retrieve Catalog Datasets:

  • Use the getDatasets action to fetch a list of available datasets.
  • Identify the target dataset (e.g., population-statistics).

Refine Dataset Exploration:

Execute the getRecordsFacets action with dataset_id=population-statistics and facet=year to filter records.

Query Dataset Records:

Use the getRecords action with dataset_id=population-statistics and the where filter year=2020 to fetch relevant records.

Export Dataset Data:

  • Export the dataset in CSV format using the exportRecordsCSV action with dataset_id=population-statistics.
  • Export the dataset in Parquet format via the exportRecordsParquet action for analytical processing.
  • Generate a GPX file for geographic visualization using the exportRecordsGPX action with name_field=region.

Retrieve Metadata and Attachments:

  • Fetch metadata for population-statistics using the getDataset action.
  • Download attachments (e.g., reports) via the getDatasetAttachments action.

Export Catalog Metadata:

  • Export the catalog as CSV using the exportCatalogCSV action.
  • Generate DCAT-AP metadata via the exportCatalogDCAT action with dcat_ap_format=dcat_ap_ch.
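
Tying the steps above together, the sketch below runs the whole workflow in sequence under the same assumptions (hypothetical base URL, endpoint-per-action convention, and illustrative field names).

import requests

CONNECTOR_BASE_URL = "https://your-connector-host/bahrain-open-data"  # placeholder
DATASET_ID = "population-statistics"

def call_action(action: str, params: dict | None = None) -> requests.Response:
    resp = requests.get(f"{CONNECTOR_BASE_URL}/{action}", params=params or {}, timeout=60)
    resp.raise_for_status()
    return resp

# 1. Retrieve catalog datasets and confirm the target dataset exists.
catalog = call_action("getDatasets", {"select": "dataset_id", "limit": 100}).json()
print(DATASET_ID in {d["dataset_id"] for d in catalog["results"]})

# 2. Refine exploration with record-level facets.
facets = call_action("getRecordsFacets", {"dataset_id": DATASET_ID, "facet": "year"}).json()

# 3. Query records for a single year.
records = call_action("getRecords", {"dataset_id": DATASET_ID, "where": "year=2020", "limit": 100}).json()
print("records for 2020:", records["total_count"])

# 4. Export the dataset in CSV, Parquet, and GPX formats.
for action, filename, extra in [
    ("exportRecordsCSV", f"{DATASET_ID}.csv", {}),
    ("exportRecordsParquet", f"{DATASET_ID}.parquet", {}),
    ("exportRecordsGPX", f"{DATASET_ID}.gpx", {"name_field": "region"}),
]:
    with open(filename, "wb") as fh:
        fh.write(call_action(action, {"dataset_id": DATASET_ID, **extra}).content)

# 5. Retrieve metadata and attachments.
metadata = call_action("getDataset", {"dataset_id": DATASET_ID}).json()
attachments = call_action("getDatasetAttachments", {"dataset_id": DATASET_ID}).json()

# 6. Export catalog metadata as CSV and DCAT-AP.
with open("catalog.csv", "wb") as fh:
    fh.write(call_action("exportCatalogCSV", {"delimiter": ","}).content)
with open("catalog_dcat.rdf", "wb") as fh:
    fh.write(call_action("exportCatalogDCAT", {"dcat_ap_format": "dcat_ap_ch"}).content)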

Pricing

Request a Quote

Support

For technical support, please contact us at

custom-connectors-support@isolutions.sa
