4. How are the API token and the custom default data store related on Airflow Clusters?

You need an API token to submit an Airflow command, and the data store ID to register the data store on QDS. When Airflow submits commands, it retrieves the data store information internally.
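As a rough illustration of the token's role, the sketch below builds (without sending) a QDS command-submission request authenticated with the account's API token. The endpoint path and the `X-AUTH-TOKEN` header follow Qubole's public REST API; the token value is a placeholder, and `HiveCommand` is just one example command type.

```python
import json
import urllib.request

API_TOKEN = "<your-QDS-API-token>"  # placeholder, not a real credential

def build_command_request(query: str) -> urllib.request.Request:
    """Build (but do not send) a QDS command-submission request."""
    body = json.dumps({"command_type": "HiveCommand", "query": query}).encode()
    return urllib.request.Request(
        url="https://api.qubole.com/api/v1.2/commands",
        data=body,
        method="POST",
        headers={
            "X-AUTH-TOKEN": API_TOKEN,          # authenticates every API call
            "Content-Type": "application/json",
        },
    )

req = build_command_request("SHOW TABLES;")
print(req.get_method(), req.full_url)
```

Airflow's Qubole integration performs the same authentication under the hood, which is why the token stored in the Qubole Default connection must be valid for the account that owns the data store.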

© Copyright 2021, Qubole.
