{"id":19923,"date":"2025-01-14T19:09:10","date_gmt":"2025-01-14T13:39:10","guid":{"rendered":"https:\/\/opstree.com\/blog\/?p=19923"},"modified":"2025-07-23T11:46:21","modified_gmt":"2025-07-23T06:16:21","slug":"part-2-automating-data-migration-with-apache-airflow","status":"publish","type":"post","link":"https:\/\/opstree.com\/blog\/2025\/01\/14\/part-2-automating-data-migration-with-apache-airflow\/","title":{"rendered":"Automating Data Migration Using Apache Airflow: A Step-by-Step Guide"},"content":{"rendered":"\r\n<p>In this second part of our blog, we\u2019ll walk through how we automated the migration process using Apache Airflow. We\u2019ll cover everything from unloading data from Amazon Redshift to S3, transferring it to Google Cloud Storage (GCS), and finally loading it into Google BigQuery. This comprehensive process was orchestrated with Airflow to make sure every step was executed smoothly, automatically, and without error.<\/p>\r\n<p><!--more--><\/p>\r\n<h2>Step 1: Setting Up Apache Airflow<\/h2>\r\n<p>Before diving into the migration tasks, we first need to ensure that Apache Airflow is properly set up. Here\u2019s how we set it up for our project.<\/p>\r\n<p>1. <strong>Install Apache Airflow:<\/strong> If you don&#8217;t have Airflow installed, use the following command to install it via pip:<\/p>\r\n\r\n\r\n\r\n<pre class=\"wp-block-code\"><code>pip install apache-airflow<\/code><\/pre>\r\n\r\n\r\n\r\n<p>2. <strong>Initialize the Airflow Database:<\/strong> Airflow requires a backend database to track tasks and maintain state. To initialize it, run:<\/p>\r\n\r\n\r\n\r\n<pre class=\"wp-block-code\"><code>airflow db init<\/code><\/pre>\r\n\r\n\r\n\r\n<p>3. 
<strong>Start the Web Server and Scheduler:<\/strong> Once Airflow is initialized, start the web server (for monitoring) and the scheduler (to run tasks), each in its own terminal:<\/p>\r\n\r\n\r\n\r\n<pre class=\"wp-block-code\"><code>airflow webserver --port 8080\nairflow scheduler<\/code><\/pre>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19925\" src=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185442-1024x495.png\" alt=\"\" width=\"620\" height=\"300\" srcset=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185442-1024x495.png 1024w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185442-300x145.png 300w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185442-768x371.png 768w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185442-1200x580.png 1200w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185442.png 1243w\" sizes=\"(max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 984px) 61vw, (max-width: 1362px) 45vw, 600px\" \/><\/figure>\r\n\r\n\r\n\r\n<ol start=\"4\">\r\n<li><strong>Set Up Connections in Airflow<\/strong>:\u00a0<\/li>\r\n<\/ol>\r\n\r\n\r\n\r\n<ol start=\"1\">\r\n<li><strong>Redshift<\/strong>: Create a connection in the Airflow UI under <strong>Admin \u2192 Connections<\/strong> for Redshift (with your host, database, username, and password).\u00a0<\/li>\r\n<\/ol>\r\n\r\n\r\n\r\n<ol start=\"2\">\r\n<li><strong>AWS<\/strong>: <a href=\"https:\/\/opstree.com\/blog\/2021\/11\/16\/aws-secret-manager\/\">Set up an AWS connection<\/a> with your <strong>AWS Access Key<\/strong> and <strong>Secret Access Key<\/strong>.\u00a0<\/li>\r\n<\/ol>\r\n\r\n\r\n\r\n<ol start=\"3\">\r\n<li><strong>Google Cloud<\/strong>: Set up a connection for <strong>Google Cloud<\/strong> using a service account 
(we&#8217;ll go into this below).\u00a0<\/li>\r\n<\/ol>\r\n<p><strong>[ Also Read Part 1: <a href=\"https:\/\/opstree.com\/blog\/2025\/01\/07\/a-step-into-the-world-of-data-mastery-optimizing-redshift-for-seamless-migration\/\">How to Optimize Amazon Redshift for Faster and Seamless Data Migration<\/a>]<\/strong><\/p>\r\n\r\n\r\n\r\n<h2 class=\"wp-block-heading\"><strong>Step 2: Setting Up Google Cloud Resources<\/strong>\u00a0<\/h2>\r\n\r\n\r\n\r\n<p>Before we can transfer data from S3 to Google Cloud Storage (GCS) and eventually to BigQuery, we need to configure the necessary resources in <strong>Google Cloud<\/strong>.\u00a0<\/p>\r\n\r\n\r\n\r\n<p><strong>Create a Google Cloud Project<\/strong>\u00a0<\/p>\r\n\r\n\r\n\r\n<ol start=\"1\">\r\n<li>Go to the <strong>Google Cloud Console<\/strong>.\u00a0<\/li>\r\n<\/ol>\r\n\r\n\r\n\r\n<ol start=\"2\">\r\n<li>Create a new project (e.g., migration-project).\u00a0<br \/><br \/><strong>Enable the Necessary APIs<\/strong>\u00a0<\/li>\r\n<\/ol>\r\n\r\n\r\n\r\n<p>To interact with Google Cloud services, you need to enable the following APIs:\u00a0<\/p>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>Google Cloud Storage API<\/strong>\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>BigQuery API<\/strong>\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<p>Go to the <strong>APIs &amp; Services \u2192 Library<\/strong>, search for these APIs, and enable them.\u00a0<\/p>\r\n\r\n\r\n\r\n<p><strong>Create a Service Account<\/strong>\u00a0<\/p>\r\n\r\n\r\n\r\n<p>Next, you\u2019ll need to create a service account with the appropriate permissions:\u00a0<\/p>\r\n\r\n\r\n\r\n<ol start=\"1\">\r\n<li>Go to the <strong>IAM &amp; Admin \u2192 Service Accounts<\/strong> section in Google Cloud.\u00a0<\/li>\r\n<\/ol>\r\n\r\n\r\n\r\n<ol start=\"2\">\r\n<li>Click <strong>Create Service Account<\/strong>, give it a name, and provide the following roles:\u00a0<\/li>\r\n<\/ol>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>Storage Admin<\/strong> (for access to Google Cloud 
Storage)\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>BigQuery Data Editor<\/strong> (for BigQuery access)\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19926\" src=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185810-1024x562.png\" alt=\"\" width=\"547\" height=\"300\" srcset=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185810-1024x562.png 1024w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185810-300x165.png 300w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185810-768x421.png 768w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185810.png 1096w\" sizes=\"(max-width: 547px) 85vw, 547px\" \/><\/figure>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19927\" src=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185836-1024x566.png\" alt=\"\" width=\"543\" height=\"300\" srcset=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185836-1024x566.png 1024w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185836-300x166.png 300w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185836-768x424.png 768w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-185836.png 1088w\" sizes=\"(max-width: 543px) 85vw, 543px\" \/><\/figure>\r\n\r\n\r\n\r\n<p>&nbsp;<\/p>\r\n\r\n\r\n\r\n<p>After creating the service account, you&#8217;ll be prompted to generate a key for it:\u00a0<\/p>\r\n\r\n\r\n\r\n<ul>\r\n<li>Select <strong>Create Key<\/strong> and choose the <strong>JSON<\/strong> format.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li>Download the 
key file and save it securely.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<p>This key file will be used to authenticate <strong>Airflow<\/strong> with <strong>Google Cloud<\/strong>.\u00a0<\/p>\r\n\r\n\r\n\r\n<p><strong><a title=\"Set Up IAM\" href=\"https:\/\/opstree.com\/blog\/2023\/10\/10\/exploring-the-power-of-iam-roles-anywhere\/\">Set Up IAM<\/a> Roles<\/strong>\u00a0<\/p>\r\n\r\n\r\n\r\n<p>Make sure to assign these roles to the service account to ensure it has access to perform actions on <strong>GCS<\/strong> and <strong>BigQuery<\/strong>.\u00a0<\/p>\r\n\r\n\r\n\r\n<h2 class=\"wp-block-heading\"><strong>Step 3: Unloading Data from Redshift to S3 with Airflow<\/strong>\u00a0<\/h2>\r\n\r\n\r\n\r\n<p>Once the resources in Google Cloud were ready, the first step was to unload data from <strong>Amazon Redshift<\/strong> into <a href=\"https:\/\/opstree.com\/blog\/2024\/11\/05\/amazon-s3-security-essentials-protect-your-data-with-these-key-practices\/\"><strong>Amazon S3<\/strong><\/a>. I used the <strong>PostgresOperator<\/strong> in Airflow to execute a Redshift UNLOAD command.\u00a0<\/p>\r\n\r\n\r\n\r\n<p>Here\u2019s the Airflow DAG (dag1.py) for unloading data from Redshift:\u00a0<\/p>\r\n\r\n\r\n\r\n<p><a href=\"https:\/\/github.com\/ramneek2109\/DataMigration\/blob\/main\/dag1.py\" target=\"_blank\" rel=\"noopener\">https:\/\/github.com\/ramneek2109\/DataMigration\/blob\/main\/dag1.py<\/a><\/p>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19928\" src=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190246-1024x562.png\" alt=\"\" width=\"547\" height=\"300\" srcset=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190246-1024x562.png 1024w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190246-300x165.png 300w, 
https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190246-768x421.png 768w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190246.png 1092w\" sizes=\"(max-width: 547px) 85vw, 547px\" \/><\/figure>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19929\" src=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190313-1024x526.png\" alt=\"\" width=\"584\" height=\"300\" srcset=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190313-1024x526.png 1024w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190313-300x154.png 300w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190313-768x395.png 768w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190313.png 1096w\" sizes=\"(max-width: 584px) 85vw, 584px\" \/><\/figure>\r\n\r\n\r\n\r\n<p>&nbsp;<\/p>\r\n\r\n\r\n\r\n<p><strong>How it works:<\/strong>\u00a0<\/p>\r\n\r\n\r\n\r\n<ul>\r\n<li>The <strong>PostgresOperator<\/strong> runs a Redshift UNLOAD query that transfers data from the users table to <strong>Amazon S3<\/strong>.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li>The <strong>CREDENTIALS<\/strong> part references the IAM role for accessing S3.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<h2 class=\"wp-block-heading\">Step 4: Transferring Data from S3 to Google Cloud Storage (GCS)\u00a0<\/h2>\r\n\r\n\r\n\r\n<p>After the data was unloaded to S3, the next step was to move it to <strong>Google Cloud Storage<\/strong> (GCS). 
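<\/p>\r\n\r\n\r\n\r\n<p>Before moving on, here\u2019s a minimal sketch of the Step 3 unload task (dag1.py) described above. The connection ID, bucket, and IAM role ARN are placeholders; see the linked repository for the full DAG.\u00a0<\/p>\r\n\r\n\r\n\r\n<pre class=\"wp-block-code\"><code># Minimal sketch of dag1.py (placeholder names): unload the users table\n# from Redshift to S3 via a PostgresOperator running UNLOAD.\nfrom datetime import datetime\nfrom airflow import DAG\nfrom airflow.providers.postgres.operators.postgres import PostgresOperator\n\nUNLOAD_SQL = \"\"\"\nUNLOAD ('SELECT * FROM users')\nTO 's3:\/\/my-migration-bucket\/users\/users_'\nCREDENTIALS 'aws_iam_role=arn:aws:iam::123456789012:role\/redshift-unload-role'\nFORMAT AS PARQUET\nALLOWOVERWRITE;\n\"\"\"\n\nwith DAG(\n    dag_id='redshift_unload_to_s3',\n    start_date=datetime(2025, 1, 1),\n    schedule=None,\n    catchup=False,\n) as dag:\n    unload_users = PostgresOperator(\n        task_id='unload_users',\n        postgres_conn_id='redshift_default',  # Redshift connection from Step 1\n        sql=UNLOAD_SQL,\n    )<\/code><\/pre>\r\n\r\n\r\n\r\n<p>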
This was done using the <strong>S3ToGCSOperator<\/strong> in Airflow.\u00a0<\/p>\r\n\r\n\r\n\r\n<p>Here\u2019s the Airflow DAG (dag2.py) for transferring data from S3 to GCS:\u00a0<\/p>\r\n\r\n\r\n\r\n<p><a href=\"https:\/\/github.com\/ramneek2109\/DataMigration\/blob\/main\/dag2.py\" target=\"_blank\" rel=\"noopener\">https:\/\/github.com\/ramneek2109\/DataMigration\/blob\/main\/dag2.py<\/a><\/p>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19930\" src=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190405-1024x566.png\" alt=\"\" width=\"543\" height=\"300\" srcset=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190405-1024x566.png 1024w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190405-300x166.png 300w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190405-768x424.png 768w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190405.png 1093w\" sizes=\"(max-width: 543px) 85vw, 543px\" \/><\/figure>\r\n\r\n\r\n\r\n<p>&nbsp;<\/p>\r\n\r\n\r\n\r\n<p><strong>How it works:<\/strong>\u00a0<\/p>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>S3ToGCSOperator<\/strong> transfers files from <strong>Amazon S3<\/strong> to <strong>Google Cloud Storage (GCS)<\/strong>.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li>The <strong>list_s3_files<\/strong> task ensures that we know what files are being transferred.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<h2 class=\"wp-block-heading\"><strong>Step 5: Loading Data from GCS to BigQuery<\/strong>\u00a0<\/h2>\r\n\r\n\r\n\r\n<p>After transferring the data to <strong>Google Cloud Storage (GCS)<\/strong>, the final step in the pipeline was to load the data into <strong>BigQuery<\/strong>. 
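<\/p>\r\n\r\n\r\n\r\n<p>Before the BigQuery load, here\u2019s a minimal sketch of that Step 4 transfer (dag2.py), again with placeholder bucket names and connection IDs:\u00a0<\/p>\r\n\r\n\r\n\r\n<pre class=\"wp-block-code\"><code># Minimal sketch of dag2.py (placeholder names): list the unloaded files\n# in S3, then copy them to a GCS bucket.\nfrom datetime import datetime\nfrom airflow import DAG\nfrom airflow.providers.amazon.aws.operators.s3 import S3ListOperator\nfrom airflow.providers.google.cloud.transfers.s3_to_gcs import S3ToGCSOperator\n\nwith DAG(\n    dag_id='s3_to_gcs_transfer',\n    start_date=datetime(2025, 1, 1),\n    schedule=None,\n    catchup=False,\n) as dag:\n    list_s3_files = S3ListOperator(\n        task_id='list_s3_files',\n        bucket='my-migration-bucket',\n        prefix='users\/',\n        aws_conn_id='aws_default',\n    )\n    transfer = S3ToGCSOperator(\n        task_id='s3_to_gcs',\n        bucket='my-migration-bucket',\n        prefix='users\/',\n        dest_gcs='gs:\/\/my-migration-gcs-bucket\/users\/',\n        replace=True,\n        aws_conn_id='aws_default',\n        gcp_conn_id='google_cloud_default',\n    )\n    list_s3_files >> transfer<\/code><\/pre>\r\n\r\n\r\n\r\n<p>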
Instead of loading the entire dataset sequentially, we used <strong>parallel task execution<\/strong> in Airflow to improve performance and reduce load time. Here&#8217;s how we achieved this:\u00a0<\/p>\r\n\r\n\r\n\r\n<p><strong><em>Dynamic Task Creation for Parallelism<\/em><\/strong>\u00a0<\/p>\r\n\r\n\r\n\r\n<p>Instead of writing a static Airflow task for loading data from GCS to BigQuery, we leveraged Airflow&#8217;s ability to dynamically create tasks for each file in the GCS bucket. By doing so:\u00a0<\/p>\r\n\r\n\r\n\r\n<ul>\r\n<li>Each file was processed as an independent task.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li>These tasks ran in parallel, utilizing the underlying infrastructure to maximize throughput.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<p>This approach works exceptionally well for large datasets split into multiple files because it allows BigQuery to ingest data efficiently across multiple streams.\u00a0<\/p>\r\n\r\n\r\n\r\n<p>Here\u2019s the Airflow DAG (dag3.py) for loading data into BigQuery:\u00a0<\/p>\r\n\r\n\r\n\r\n<p><a href=\"https:\/\/github.com\/ramneek2109\/DataMigration\/blob\/main\/dag3.py\" target=\"_blank\" rel=\"noopener\">https:\/\/github.com\/ramneek2109\/DataMigration\/blob\/main\/dag3.py<\/a><\/p>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19931\" src=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190504-1024x498.png\" alt=\"\" width=\"617\" height=\"300\" srcset=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190504-1024x498.png 1024w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190504-300x146.png 300w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190504-768x373.png 768w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190504.png 1096w\" sizes=\"(max-width: 
709px) 85vw, (max-width: 909px) 67vw, (max-width: 984px) 61vw, (max-width: 1362px) 45vw, 600px\" \/><\/figure>\r\n\r\n\r\n\r\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19932\" src=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190525-1024x618.png\" alt=\"\" width=\"497\" height=\"300\" srcset=\"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190525-1024x618.png 1024w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190525-300x181.png 300w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190525-768x464.png 768w, https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-14-190525.png 1100w\" sizes=\"(max-width: 497px) 85vw, 497px\" \/><\/figure>\r\n\r\n\r\n\r\n<p>&nbsp;<\/p>\r\n\r\n\r\n\r\n<p><strong>How it works:<\/strong>\u00a0<\/p>\r\n\r\n\r\n\r\n<ul>\r\n<li><strong>GCSToBigQueryOperator<\/strong> loads the files from <strong>Google Cloud Storage<\/strong> into <strong>BigQuery<\/strong>, using the schema I defined.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<ul>\r\n<li>The table is <strong>partitioned by created_at<\/strong> and <strong>clustered by user_id and last_login<\/strong> for fast and efficient querying.\u00a0<\/li>\r\n<\/ul>\r\n\r\n\r\n\r\n<h2 class=\"wp-block-heading\"><strong>Wrapping Up<\/strong>\u00a0<\/h2>\r\n\r\n\r\n\r\n<p>With the combination of <strong>Redshift<\/strong>, <strong>Airflow<\/strong>, <strong>S3<\/strong>, <strong>Google Cloud Storage<\/strong>, and <strong>BigQuery<\/strong>, we successfully automated our data migration pipeline. By orchestrating the entire process with Airflow, we ensured that the data migrated efficiently, without a hitch, and in a way that optimizes performance in BigQuery.\u00a0<\/p>\r\n\r\n\r\n\r\n<p>I hope this guide provides valuable insights for your next data migration project. 
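<\/p>\r\n\r\n\r\n\r\n<p>As a final reference, here\u2019s a minimal sketch of the Step 5 dynamic per-file load (dag3.py); the file list, project, dataset, and table names are placeholders:\u00a0<\/p>\r\n\r\n\r\n\r\n<pre class=\"wp-block-code\"><code># Minimal sketch of dag3.py (placeholder names): create one load task per\n# file so the GCS-to-BigQuery loads run in parallel.\nfrom datetime import datetime\nfrom airflow import DAG\nfrom airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator\n\n# In the real DAG the file list comes from the GCS bucket; hardcoded here.\nFILES = ['users\/users_000.parquet', 'users\/users_001.parquet']\n\nwith DAG(\n    dag_id='gcs_to_bigquery_load',\n    start_date=datetime(2025, 1, 1),\n    schedule=None,\n    catchup=False,\n) as dag:\n    for i, obj in enumerate(FILES):\n        GCSToBigQueryOperator(\n            task_id=f'load_file_{i}',\n            bucket='my-migration-gcs-bucket',\n            source_objects=[obj],\n            destination_project_dataset_table='migration-project.analytics.users',\n            source_format='PARQUET',\n            write_disposition='WRITE_APPEND',\n            time_partitioning={'type': 'DAY', 'field': 'created_at'},\n            cluster_fields=['user_id', 'last_login'],\n            gcp_conn_id='google_cloud_default',\n        )<\/code><\/pre>\r\n\r\n\r\n\r\n<p>How many of these tasks actually run at once is bounded by the Airflow executor, pool, and DAG concurrency settings.\u00a0<\/p>\r\n\r\n\r\n\r\n<p>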
Feel free to reach out with any questions or feedback!\u00a0<\/p>\r\n\r\n\r\n","protected":false},"excerpt":{"rendered":"<p>In this second part of our blog, we\u2019ll walk through how we automated the migration process using Apache Airflow. We\u2019ll cover everything from unloading data from Amazon Redshift to S3, transferring it to Google Cloud Storage (GCS), and finally loading it into Google BigQuery. This comprehensive process was orchestrated with Airflow to make sure every &hellip; <a href=\"https:\/\/opstree.com\/blog\/2025\/01\/14\/part-2-automating-data-migration-with-apache-airflow\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Automating Data Migration Using Apache Airflow: A Step-by-Step Guide&#8221;<\/span><\/a><\/p>\n","protected":false},"author":244582684,"featured_media":19933,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_coblocks_attr":"","_coblocks_dimensions":"","_coblocks_responsive_height":"","_coblocks_accordion_ie_support":"","jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","enabled":false},"version":2}},"categories":[36349927],"tags":[768739440,768739294,768739372,4996032,768739407],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/01\/Automating-Data-Migration-with-Apache-Airflow.png","jetpack_likes_enabled":false,"jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/pfDBOm-5bl","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/opstree.com\/blog\/wp-
json\/wp\/v2\/posts\/19923"}],"collection":[{"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/users\/244582684"}],"replies":[{"embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/comments?post=19923"}],"version-history":[{"count":8,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/posts\/19923\/revisions"}],"predecessor-version":[{"id":29434,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/posts\/19923\/revisions\/29434"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/media\/19933"}],"wp:attachment":[{"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/media?parent=19923"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/categories?post=19923"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/tags?post=19923"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}