
Troubleshooting

Workflow Deployment Error

If any errors occur during the workflow deployment process, the Ingestion Pipeline entity will still be created, but no workflow will be present in the Ingestion container.
  • You can then Edit the Ingestion Pipeline and Deploy it again.
  • From the Connection tab, you can also Edit the Service if needed.

Connector Debug Troubleshooting

This section provides instructions to help resolve common issues encountered during connector setup and metadata ingestion in OpenMetadata. Below are some of the most frequently observed troubleshooting scenarios.

How to Enable Debug Logging for Any Ingestion

To enable debug logging for any ingestion workflow in OpenMetadata:
  1. Navigate to Services: Go to Settings > Services > Service Type (e.g., Database) in the OpenMetadata UI.
  2. Select a Service: Choose the specific service for which you want to enable debug logging.
  3. Access the Ingestion Tab: Go to the Ingestion tab, click the three-dot menu on the right-hand side of the ingestion type, and select Edit.
  4. Enable Debug Logging: In the configuration dialog, enable the Debug Log option and click Next.
  5. Schedule and Submit: Configure the schedule if needed and click Submit to apply the changes.
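If you run the same workflow from the CLI with a YAML file rather than through the UI, the equivalent of the Debug Log toggle is the loggerLevel field in workflowConfig. A minimal sketch (server address and auth provider shown here are illustrative and should match your own deployment):

```yaml
workflowConfig:
  # DEBUG raises verbosity from the default INFO level
  loggerLevel: DEBUG
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: no-auth
```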

Permission Issues

If you encounter permission-related errors during connector setup or metadata ingestion, ensure that all the prerequisites and access configurations specified for each connector are properly implemented. Refer to the connector-specific documentation to verify the required permissions.

Unity Catalog Connection Details

source:
  type: unitycatalog
  serviceName: local_unity_catalog
  serviceConnection:
    config:
      type: UnityCatalog
      catalog: hive_metastore
      databaseSchema: default
      token: <databricks token>
      hostPort: localhost:443
      connectionArguments:
        http_path: <http path of databricks cluster>
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: no-auth
To obtain the hostPort, token, and http_path values:
  1. Log in to Azure Databricks and, from the sidebar, select SQL Warehouses (in the SQL section).
  2. Click the desired warehouse in the SQL Warehouses list.
  3. On the warehouse page, open the Connection details section. The Server hostname and Port together form your hostPort, and the HTTP path is your http_path.
  4. In the Connection details section, click Create a personal access token, then generate a new token on the page that opens.
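As an illustration of how the retrieved values slot into the connection config, here is a hedged sketch — the hostname, token, and warehouse path below are made-up placeholders, not real values:

```yaml
serviceConnection:
  config:
    type: UnityCatalog
    # hostPort = "<Server hostname>:<Port>" from the Connection details section
    hostPort: adb-1234567890123456.7.azuredatabricks.net:443
    # token = the personal access token you generated
    token: dapiXXXXXXXXXXXXXXXXXXXX
    connectionArguments:
      # http_path = the "HTTP path" value from the Connection details section
      http_path: /sql/1.0/warehouses/abc123def456
```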