
Connect to Live Azure Data Lake Storage Data in a PostgreSQL Interface through CData Connect Cloud



Create a live connection to Azure Data Lake Storage in CData Connect Cloud and connect to your Azure Data Lake Storage data from PostgreSQL.

PostgreSQL is a popular interface for data access, and a vast number of PostgreSQL clients are available. When you pair PostgreSQL with CData Connect Cloud, you gain database-like access to live Azure Data Lake Storage data from PostgreSQL. In this article, we walk through the process of connecting to Azure Data Lake Storage in Connect Cloud and establishing a connection between Connect Cloud and PostgreSQL using a TDS foreign data wrapper (FDW).

CData Connect Cloud provides a pure SQL Server interface for Azure Data Lake Storage, allowing you to query data from Azure Data Lake Storage without replicating the data to a natively supported database. Using optimized data processing out of the box, CData Connect Cloud pushes all supported SQL operations (filters, JOINs, etc.) directly to Azure Data Lake Storage, leveraging server-side processing to return the requested Azure Data Lake Storage data quickly.
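
For example, once the connection below is configured, a filtered query issued against the Connect Cloud SQL Server endpoint is executed by Azure Data Lake Storage itself rather than in the client. A minimal sketch, assuming the ADLS connection and Resources table set up later in this article (the filter value is a placeholder):

    SELECT id, Permission
    FROM ADLS.Resources
    WHERE Permission = '<permission value>';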

Connect to Azure Data Lake Storage in Connect Cloud

CData Connect Cloud uses a straightforward, point-and-click interface to connect to data sources.

  1. Log into Connect Cloud, click Connections, and click Add Connection
  2. Select "Azure Data Lake Storage" from the Add Connection panel
  3. Enter the necessary authentication properties to connect to Azure Data Lake Storage (sample property sets for both account generations are shown after this list).

    Authenticating to a Gen 1 DataLakeStore Account

    Gen 1 uses OAuth 2.0 in Azure AD for authentication.

    For this, an Active Directory web application is required. You can create one as follows:

    1. Sign in to your Azure Account through the Azure portal.
    2. Select "Azure Active Directory".
    3. Select "App registrations".
    4. Select "New application registration".
    5. Provide a name and URL for the application. Select Web app for the type of application you want to create.
    6. Select "Required permissions" and change the required permissions for this app. At a minimum, "Azure Data Lake" and "Windows Azure Service Management API" are required.
    7. Select "Key" and generate a new key. Add a description and a duration, and take note of the generated key. You won't be able to see it again.

    To authenticate against a Gen 1 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen1.
    • Account: Set this to the name of the account.
    • OAuthClientId: Set this to the application Id of the app you created.
    • OAuthClientSecret: Set this to the key generated for the app you created.
    • TenantId: Set this to the Id of the Azure AD tenant where the application resides. In the Azure portal, this appears as the Directory (tenant) ID on the app registration's overview page.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.

    Authenticating to a Gen 2 DataLakeStore Account

    To authenticate against a Gen 2 DataLakeStore account, the following properties are required:

    • Schema: Set this to ADLSGen2.
    • Account: Set this to the name of the account.
    • FileSystem: Set this to the file system which will be used for this account.
    • AccessKey: Set this to the access key used to authenticate calls to the API. You can find it in the Azure portal under the storage account's Access keys settings.
    • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
  4. Click Create & Test
  5. Navigate to the Permissions tab in the Add Azure Data Lake Storage Connection page and update the User-based permissions.
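
For reference, a completed set of connection properties might look like the following; every value below is an illustrative placeholder, not a real account:

    Gen 1:
      Schema = ADLSGen1
      Account = myadlsgen1account
      OAuthClientId = <application Id of the registered app>
      OAuthClientSecret = <key generated for the app>
      TenantId = <Azure AD tenant Id>
      Directory = /connectcloud

    Gen 2:
      Schema = ADLSGen2
      Account = mystorageaccount
      FileSystem = myfilesystem
      AccessKey = <storage account access key>
      Directory = /connectcloud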

Add a Personal Access Token

If you are connecting from a service, application, platform, or framework that does not support OAuth authentication, you can create a Personal Access Token (PAT) to use for authentication. As a best practice, create a separate PAT for each service to maintain granular access control.

  1. Click on your username at the top right of the Connect Cloud app and click User Profile.
  2. On the User Profile page, scroll down to the Personal Access Tokens section and click Create PAT.
  3. Give your PAT a name and click Create.
  4. The personal access token is only visible at creation, so be sure to copy it and store it securely for future use.
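
Optionally, you can sanity-check the new PAT against the Connect Cloud SQL Server (TDS) endpoint before configuring the FDW. A quick test, assuming the tsql utility from FreeTDS (a dependency of tds_fdw) is installed, using the same host and port as the server object created later in this article (username and token are placeholders):

    tsql -H tds.cdata.com -p 14333 -U username@cdata.com -P your_personal_access_token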

Build the TDS Foreign Data Wrapper

The Foreign Data Wrapper can be installed as an extension to PostgreSQL, without recompiling PostgreSQL. The tds_fdw extension is used as an example (https://github.com/tds-fdw/tds_fdw).

  1. Clone and build the git repository:

     sudo apt-get install git
     git clone https://github.com/tds-fdw/tds_fdw.git
     cd tds_fdw
     make USE_PGXS=1
     sudo make USE_PGXS=1 install

     Note: If you have several PostgreSQL versions and you do not want to build for the default one, first locate the binary for pg_config, take note of the full path, and then append PG_CONFIG=<path to pg_config> after USE_PGXS=1 in the make commands.
  2. After you finish the installation, start the server:

     sudo service postgresql start
  3. Then connect to the Postgres database:

     psql -h localhost -U postgres -d postgres

     Note: Instead of localhost, you can use the IP address where your PostgreSQL instance is hosted.
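
As an optional sanity check from inside psql, you can confirm that the build registered the extension with the server before creating it:

    SELECT name, default_version
    FROM pg_available_extensions
    WHERE name = 'tds_fdw';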

Connect to Azure Data Lake Storage data as a PostgreSQL Database and query the data!

After you have installed the extension, follow the steps below to start executing queries to Azure Data Lake Storage data:

  1. Log into your database.
  2. Load the extension for the database:

     CREATE EXTENSION tds_fdw;
  3. Create a server object for Azure Data Lake Storage data:

     CREATE SERVER "ADLS1"
       FOREIGN DATA WRAPPER tds_fdw
       OPTIONS (servername 'tds.cdata.com', port '14333', database 'ADLS1');
  4. Configure user mapping with your email and Personal Access Token from your Connect Cloud account:

     CREATE USER MAPPING FOR postgres
       SERVER "ADLS1"
       OPTIONS (username 'username@cdata.com', password 'your_personal_access_token');
  5. Create the local schema:

     CREATE SCHEMA "ADLS1";
  6. Create a foreign table in your local database. Using a table_name definition:

     CREATE FOREIGN TABLE "ADLS1".Resources (
       id varchar,
       Permission varchar)
       SERVER "ADLS1"
       OPTIONS (table_name 'ADLS.Resources', row_estimate_method 'showplan_all');

     Or using a schema_name and table_name definition:

     CREATE FOREIGN TABLE "ADLS1".Resources (
       id varchar,
       Permission varchar)
       SERVER "ADLS1"
       OPTIONS (schema_name 'ADLS', table_name 'Resources', row_estimate_method 'showplan_all');

     Or using a query definition:

     CREATE FOREIGN TABLE "ADLS1".Resources (
       id varchar,
       Permission varchar)
       SERVER "ADLS1"
       OPTIONS (query 'SELECT * FROM ADLS.Resources', row_estimate_method 'showplan_all');

     Or setting a remote column name:

     CREATE FOREIGN TABLE "ADLS1".Resources (
       id varchar,
       col2 varchar OPTIONS (column_name 'Permission'))
       SERVER "ADLS1"
       OPTIONS (schema_name 'ADLS', table_name 'Resources', row_estimate_method 'showplan_all');
  7. You can now execute read/write commands to Azure Data Lake Storage:

     SELECT id, Permission FROM "ADLS1".Resources;
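
From here, the foreign table behaves like any other PostgreSQL relation, so you can filter, sort, and join it with local tables using ordinary SQL. A small sketch (the filter is a placeholder; depending on the FDW version, filters may be applied locally in PostgreSQL after the rows are fetched):

    SELECT id, Permission
    FROM "ADLS1".Resources
    WHERE Permission IS NOT NULL
    ORDER BY id;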

More Information & Free Trial

You have now run a simple query against live Azure Data Lake Storage data. For more information on connecting to Azure Data Lake Storage (and more than 100 other data sources), visit the Connect Cloud page. Sign up for a free trial and start working with live Azure Data Lake Storage data in PostgreSQL.