AlationSink
The connector ingests data from OpenMetadata into Alation.
Configure and schedule the AlationSink metadata workflow from the OpenMetadata UI:

How to Run the Connector Externally

To run the Ingestion via the UI you’ll need to use the OpenMetadata Ingestion Container, which comes shipped with custom Airflow plugins to handle the workflow deployment. If, instead, you want to manage your workflows externally on your preferred orchestrator, you can check the following docs to run the Ingestion Framework anywhere.
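For reference, an externally managed run is typically described in a YAML workflow file and executed with the Ingestion Framework CLI (for example, `metadata ingest -c alation_sink.yaml`). The sketch below is a minimal, hypothetical layout: the host, service name, token values, and the sourceConfig type are placeholders or assumptions, so verify them against the AlationSink connection schema and the externally-run ingestion docs referenced above.

```yaml
# Hypothetical workflow definition for running the AlationSink connector externally.
# All values are placeholders; verify field names against the connector schema.
source:
  type: alationsink
  serviceName: alation_sink_prod              # the Service name registered in OpenMetadata
  serviceConnection:
    config:
      type: AlationSink
      hostPort: "https://alation.example.com"
      authType:
        accessToken: "<ALATION_ACCESS_TOKEN>" # or username/password for basic auth
      projectName: openmetadata_sink
      paginationLimit: 10
  sourceConfig:
    config:
      type: DatabaseMetadata                  # assumed default; confirm in the connector docs
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"
    authProvider: openmetadata
    securityConfig:
      jwtToken: "<OPENMETADATA_BOT_JWT>"
```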

Requirements

The connector uses POST requests to write data into Alation. Hence, user credentials or an access token with Source Admin, Catalog Admin, or Server Admin permissions are required. Follow the link here to create the access token.

Data Mapping and Assumptions

The following entities are supported and will be mapped from OpenMetadata to the corresponding entities in Alation.

| Alation Entity | OpenMetadata Entity |
| --- | --- |
| Data Source (OCF) | Database |
| Schema | Schema |
| Table | Table |
| Columns | Columns |
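For illustration only (the names are invented), here is how one OpenMetadata database tree would land in Alation under this mapping. The database fullyQualifiedName format `<serviceName>.<databaseName>` is the same one referenced later by the DataSource Links option.

```yaml
# Illustration, not a configuration file: sample OpenMetadata assets and the
# Alation entities they map to. All names are made up for the example.
openmetadata:
  database: sample_data.ecommerce_db                       # FQN: <service>.<database>
  schema: sample_data.ecommerce_db.shopify
  table: sample_data.ecommerce_db.shopify.dim_customer
alation:
  data_source: ecommerce_db          # written as an OCF data source
  schema: shopify
  table: dim_customer
```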

Metadata Ingestion

Then, prepare the Alation Sink Service and configure the Ingestion:
1

Visit the Services Page

Click `Settings` in the side navigation bar and then `Services`. The first step is to ingest the metadata from your sources. To do that, you first need to create a Service connection. This Service will be the bridge between OpenMetadata and your source system. Once a Service is created, it can be used to configure your ingestion workflows.
2

Create a New Service

Click on _Add New Service_ to start the Service creation.
3

Select the Service Type

Select AlationSink as the Service type and click _Next_.
4

Name and Describe your Service

Provide a name and description for your Service.

Service Name

OpenMetadata uniquely identifies Services by their **Service Name**. Provide a name that distinguishes your deployment from other Services, including the other AlationSink Services that you might be ingesting metadata from. Note that when the name is set, it cannot be changed.
5

Configure the Service Connection

In this step, we will configure the connection settings required for AlationSink. Please follow the instructions below to properly configure the Service to read from your sources. You will also find helper documentation on the right-hand side panel in the UI.

Connection Details

1

Connection Details

When using a Hybrid Ingestion Runner, any sensitive credential fields—such as passwords, API keys, or private keys—must reference secrets using the following format:
```yaml
password: secret:/my/database/password
```
This applies only to fields marked as secrets in the connection form (these typically mask input and show a visibility toggle icon). For a complete guide on managing secrets in hybrid setups, see the Hybrid Ingestion Runner Secret Management Guide.
  • Host and Port: Host and port of the Alation service.
  • Authentication Types:
    1. Basic Authentication
    • Username: The name of the user whose credentials will be used to sign in.
    • Password: The password of the user.
    2. Access Token Authentication: the access token created using the steps mentioned here can be entered directly and will be used to authenticate against the Alation APIs.
    • accessToken: Generated access token.
  • Project Name: Project name used to create the refresh token. Can be anything.
  • Pagination Limit: Pagination limit used when calling the Alation APIs.
  • DataSource Links: A custom mapping between OpenMetadata databases and Alation data sources. If this mapping is present, the connector will only look up the mapped data source in Alation and create the other entities inside it; it will not create the data source itself, so it must already exist in Alation. The mapping uses the format `alation_datasource_id: openmetadata_database_fqn`, where `alation_datasource_id` is the numerical id of the data source in Alation and `openmetadata_database_fqn` is the fullyQualifiedName of the database in OpenMetadata. Below is an example of the mapping (a consolidated connection sketch follows it):
```yaml
datasourceLinks: {
    "23": "sample_data.ecommerce_db",
    "15": "mysql_prod.customers_db"
}
```
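Putting the options above together, a hedged sketch of the connection values might look like the following. The host, token, data source ids, and database FQNs are placeholders; on a Hybrid Ingestion Runner, the secret fields would use the `secret:/...` reference format described earlier.

```yaml
# Hypothetical AlationSink connection values; replace the placeholders with your own.
hostPort: "https://alation.example.com"
authType:
  accessToken: secret:/alation/access_token   # or username/password for basic auth
projectName: openmetadata_sink
paginationLimit: 10
datasourceLinks:
  "23": "sample_data.ecommerce_db"
  "15": "mysql_prod.customers_db"
```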
2

Test the Connection

Once the credentials have been added, click on Test Connection and Save the changes.
3

Configure Metadata Ingestion

In this step we will configure the metadata ingestion pipeline. Please follow the instructions below.
4

Schedule the Ingestion and Deploy

Scheduling can be set up at an hourly, daily, weekly, or manual cadence. The timezone is UTC. Select a Start Date for the ingestion schedule; adding an End Date is optional. Review your configuration settings: if they match what you intended, click Deploy to create the Service and schedule the metadata ingestion; if something doesn't look right, click the Back button to return to the appropriate step and adjust the settings as needed.
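For reference, when the pipeline is deployed through the UI the selected cadence is stored as a cron expression on the ingestion pipeline's Airflow configuration. The fragment below is a hedged sketch of what a daily UTC schedule could look like; the field names are assumed from the IngestionPipeline schema, so verify them against your OpenMetadata version.

```yaml
# Hypothetical scheduling fragment; the cron expression and date are examples only.
airflowConfig:
  scheduleInterval: "0 2 * * *"   # daily at 02:00 UTC
  startDate: "2024-01-01"         # optional; when the schedule should begin
```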
5

View the Ingestion Pipeline

Once the workflow has been successfully deployed, you can view the Ingestion Pipeline running from the Service Page.
If AutoPilot is enabled, workflows like usage tracking, data lineage, and similar tasks will be handled automatically. Users don’t need to set up or manage them - AutoPilot takes care of everything in the system.