Extract Metadata from GCP Composer
Requirements
This approach has been last tested against:
- Composer version 2.5.4
- Airflow version 2.6.3
- openmetadata-ingestion==1.2.4.3
There are two main approaches we can follow here to extract metadata from GCP Composer. Both of them involve creating a DAG
directly in your Composer instance, but the requirements and the steps to follow are going to be slightly different.
Feel free to choose whatever approach adapts best to your current architecture and constraints.
Using the Python Operator
The most convenient way to extract metadata out of GCP Composer is by creating a DAG directly in there that will handle the connection to the metadata database automatically and push the contents to your OpenMetadata server. The drawback here? You need to install openmetadata-ingestion directly on the host. This might have some
incompatibilities with your current Python environment and/or the internal (and changing) Composer requirements.
In any case, once the requirements are there, preparing the DAG is straightforward.
Install the Requirements
In your environment you will need to install the following packages:
- openmetadata-ingestion==x.y.z (e.g., openmetadata-ingestion==1.2.4).
- sqlalchemy==1.4.27: this is needed to align the OpenMetadata requirements with the Composer internal constraints.

Make sure to install an openmetadata-ingestion version that matches the server version
you currently have!
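For reference, and assuming the versions listed above, the PyPI packages you add to the Composer environment would be pinned along these lines (adapt the openmetadata-ingestion version to your server):

```
openmetadata-ingestion==1.2.4
sqlalchemy==1.4.27
```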
Prepare the DAG!
Note that this DAG is a usual connector DAG, just using the Airflow service with the Backend connection.
As an example of a DAG pushing data to OpenMetadata under Google SSO, we could have:
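The snippet below is a minimal sketch of such a DAG, not a drop-in implementation: it assumes an openmetadata-ingestion 1.2.x release where the MetadataWorkflow class is available, and the serviceName, hostPort, and secretKey values are placeholders you will need to adapt to your own environment:

```python
from datetime import timedelta

import yaml
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago

# Available in openmetadata-ingestion 1.2.x; older releases expose a different Workflow class.
from metadata.workflow.metadata import MetadataWorkflow

default_args = {
    "retries": 3,
    "retry_delay": timedelta(seconds=10),
    "execution_timeout": timedelta(minutes=60),
}

# Airflow service using the Backend connection, pushing to a server secured with Google SSO.
# serviceName, hostPort and secretKey are placeholders: adapt them to your setup.
config = """
source:
  type: airflow
  serviceName: airflow_gcp_composer
  serviceConnection:
    config:
      type: Airflow
      hostPort: http://localhost:8080
      numberOfStatus: 10
      connection:
        type: Backend
  sourceConfig:
    config:
      type: PipelineMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  loggerLevel: INFO
  openMetadataServerConfig:
    hostPort: https://<your-openmetadata-server>/api
    authProvider: google
    securityConfig:
      secretKey: /path/to/service-account.json
"""


def metadata_ingestion_workflow():
    # Parse the YAML recipe and run the metadata workflow against the Airflow backend.
    workflow_config = yaml.safe_load(config)
    workflow = MetadataWorkflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.stop()


with DAG(
    "composer_metadata_extraction",
    default_args=default_args,
    description="Pushes Composer pipeline metadata to OpenMetadata",
    start_date=days_ago(1),
    is_paused_upon_creation=True,
    schedule_interval="*/30 * * * *",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="ingest_metadata",
        python_callable=metadata_ingestion_workflow,
    )
```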
Using the Kubernetes Pod Operator
In this second approach we won't need to install anything in the GCP Composer environment. Instead, we will rely on the KubernetesPodOperator to use the underlying Kubernetes cluster of Composer.
The code won't run directly in the host's environment, but rather inside a container that we created
with only the openmetadata-ingestion package.
Requirements
The only thing we need to handle here is getting the URL of the underlying Composer database. You can follow the official GCP Composer docs for the steps to obtain the credentials. In a nutshell, from the Airflow UI you can go to Admin > Configurations and search for sql_alchemy_conn. In our case,
the URL looked like this:
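As a purely illustrative, hypothetical example, the connection string is a SQLAlchemy URL pointing at the Composer SQL proxy, along these lines (older Composer versions may expose a mysql+mysqldb URL instead):

```
postgresql+psycopg2://<user>:<password>@<sql-proxy-host>:<port>/<composer-database-name>
```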
Prepare the DAG!
Kubernetes Pod Operator
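Below is a minimal sketch of what such a DAG could look like. It assumes the cncf.kubernetes provider is available in your Composer environment; the connection details inside the config YAML (username, password, hostPort, database) are placeholders to be filled with the values extracted from the sql_alchemy_conn URL above, and the OpenMetadata server settings are placeholders as well:

```python
from airflow import models
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,
)
from airflow.utils.dates import days_ago

# The connection block points at the Composer database extracted from sql_alchemy_conn.
# All credentials, hosts and the database name are placeholders to be replaced.
config = """
source:
  type: airflow
  serviceName: airflow_gcp_composer_k8s
  serviceConnection:
    config:
      type: Airflow
      hostPort: http://localhost:8080
      numberOfStatus: 10
      connection:
        type: Postgres
        username: <user>
        password: <password>
        hostPort: <sql-proxy-host>:<port>
        database: <composer-database-name>
  sourceConfig:
    config:
      type: PipelineMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  loggerLevel: INFO
  openMetadataServerConfig:
    hostPort: https://<your-openmetadata-server>/api
    authProvider: google
    securityConfig:
      secretKey: /path/to/service-account.json
"""

with models.DAG(
    "composer_metadata_extraction_k8s",
    schedule_interval="*/30 * * * *",
    start_date=days_ago(1),
    catchup=False,
    is_paused_upon_creation=True,
) as dag:
    KubernetesPodOperator(
        task_id="ingest_metadata",
        name="ingest_metadata",
        namespace="default",
        image="openmetadata/ingestion-base:0.13.2",
        # main.py inside the image reads these env vars: keep cmds and the dictionary
        # keys as-is and only adapt the YAML above.
        cmds=["python", "main.py"],
        env_vars={"config": config, "pipelineType": "metadata"},
    )
```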
You can name the task as you want (task_id and name). The important points here are the cmds, which should not
be changed, and the env_vars. The main.py script that gets shipped within the image will load the env vars
as they are shown, so only modify the content of the config YAML, not this dictionary.
Note that the example uses the image openmetadata/ingestion-base:0.13.2. Update that accordingly for higher versions
once they are released. Also, the image version should be aligned with your OpenMetadata server version to avoid
incompatibilities.
You can read more about the KubernetesPodOperator and how to tune its configuration
here.