The Azure Data Lake Storage ODBC Driver is a powerful tool that allows you to connect with live data from Azure Data Lake Storage, directly from any applications that support ODBC connectivity.

Access Azure Data Lake Storage data like you would a database: read, write, and update Azure Data Lake Storage data through a standard ODBC driver interface.

How to Connect to Azure Data Lake Storage Data in Python on Linux/UNIX



Create Python applications on Linux/UNIX machines with connectivity to Azure Data Lake Storage data. Leverage the pyodbc module for ODBC in Python.

The rich ecosystem of Python modules lets you get to work quicker and integrate your systems more effectively. With the CData Linux/UNIX ODBC Driver for Azure Data Lake Storage and the pyodbc module, you can easily build Azure Data Lake Storage-connected Python applications. This article shows how to use the pyodbc built-in functions to connect to Azure Data Lake Storage data, execute queries, and output the results.

Using the CData ODBC Drivers on a UNIX/Linux Machine

The CData ODBC Drivers are supported on various Red Hat-based and Debian-based systems, including Ubuntu, Debian, RHEL, CentOS, and Fedora. Several libraries and packages are also required, many of which may be installed by default, depending on your system. For more information on the supported versions of Linux operating systems and the required libraries, please refer to the "Getting Started" section in the help documentation (installed with the driver and also available online).

Installing the Driver Manager

Before installing the driver, check that your system has a driver manager. For this article, you will use unixODBC, a free and open source ODBC driver manager that is widely supported.

For Debian-based systems like Ubuntu, you can install unixODBC with the APT package manager:

$ sudo apt-get install unixodbc unixodbc-dev

For systems based on Red Hat Linux, you can install unixODBC with yum or dnf:

$ sudo yum install unixODBC unixODBC-devel

The unixODBC driver manager reads information about drivers from an odbcinst.ini file and about data sources from an odbc.ini file. You can determine the location of the configuration files on your system by entering the following command into a terminal:

$ odbcinst -j

The output of the command will display the locations of the configuration files for ODBC data sources and registered ODBC drivers. User data sources can only be accessed by the user account whose home folder contains the odbc.ini file. System data sources can be accessed by all users. Below is an example of the output of this command:

DRIVERS............: /etc/odbcinst.ini
SYSTEM DATA SOURCES: /etc/odbc.ini
FILE DATA SOURCES..: /etc/ODBCDataSources
USER DATA SOURCES..: /home/myuser/.odbc.ini
SQLULEN Size.......: 8
SQLLEN Size........: 8
SQLSETPOSIROW Size.: 8

Installing the Driver

You can download the driver in standard package formats: the Debian .deb package format or the .rpm file format. Once you have downloaded the file, you can install the driver from the terminal.

The driver installer registers the driver with unixODBC and creates a system DSN, which can be used later in any tools or applications that support ODBC connectivity.

For Debian-based systems like Ubuntu, run the following command with sudo or as root:

$ dpkg -i /path/to/package.deb

For Red Hat systems and other systems that support .rpms, run the following command with sudo or as root:

$ rpm -i /path/to/package.rpm

Once the driver is installed, you can list the registered drivers and defined data sources using the unixODBC driver manager:

List the Registered Driver(s)

$ odbcinst -q -d
CData ODBC Driver for Azure Data Lake Storage
...

List the Defined Data Source(s)

$ odbcinst -q -s
CData ADLS Source
...

To use the CData ODBC Driver for Azure Data Lake Storage with unixODBC, ensure that the driver is configured to use UTF-16. To do so, edit the INI file for the driver (cdata.odbc.adls.ini), which can be found in the lib folder in the installation location (typically /opt/cdata/cdata-odbc-driver-for-adls), as follows:

cdata.odbc.adls.ini

...
[Driver]
DriverManagerEncoding = UTF-16

Modifying the DSN

The driver installation predefines a system DSN. You can modify the DSN by editing the system data sources file (/etc/odbc.ini) and defining the required connection properties. Additionally, you can create user-specific DSNs in $HOME/.odbc.ini that do not require root access to modify.

Authenticating to a Gen 1 DataLakeStore Account

Gen 1 uses OAuth 2.0 in Azure AD for authentication.

For this, an Active Directory web application is required. You can create one as follows:

  1. Sign in to your Azure account through the Azure Portal.
  2. Select "Azure Active Directory".
  3. Select "App registrations".
  4. Select "New application registration".
  5. Provide a name and URL for the application. Select Web app for the type of application you want to create.
  6. Select "Required permissions" and change the required permissions for this app. At a minimum, "Azure Data Lake" and "Windows Azure Service Management API" are required.
  7. Select "Key" and generate a new key. Add a description and a duration, then take note of the generated key; you won't be able to see it again.

To authenticate against a Gen 1 DataLakeStore account, the following properties are required:

  • Schema: Set this to ADLSGen1.
  • Account: Set this to the name of the account.
  • OAuthClientId: Set this to the application Id of the app you created.
  • OAuthClientSecret: Set this to the key generated for the app you created.
  • TenantId: Set this to the tenant Id. See the help documentation for the TenantId property for more information on how to acquire this.
  • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.

Authenticating to a Gen 2 DataLakeStore Account

To authenticate against a Gen 2 DataLakeStore account, the following properties are required:

  • Schema: Set this to ADLSGen2.
  • Account: Set this to the name of the account.
  • FileSystem: Set this to the file system which will be used for this account.
  • AccessKey: Set this to the access key which will be used to authenticate the calls to the API. See the help documentation for the AccessKey property for more information on how to acquire this.
  • Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.

/etc/odbc.ini or $HOME/.odbc.ini

[CData ADLS Source]
Driver = CData ODBC Driver for Azure Data Lake Storage
Description = My Description
Schema = ADLSGen2
Account = myAccount
FileSystem = myFileSystem
AccessKey = myAccessKey

For specific information on using these configuration files, please refer to the help documentation (installed with the driver and also available online).

You can follow the procedure below to install pyodbc and start accessing Azure Data Lake Storage through Python objects.

Install pyodbc

You can use the pip utility to install the module:

pip install pyodbc

Be sure to import the module with the following:

import pyodbc

Connect to Azure Data Lake Storage Data in Python

You can now connect with an ODBC connection string or a DSN. Below is the syntax for a connection string:

cnxn = pyodbc.connect('DRIVER={CData ODBC Driver for Azure Data Lake Storage};Schema=ADLSGen2;Account=myAccount;FileSystem=myFileSystem;AccessKey=myAccessKey;')
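For a Gen 1 account, a connection string built from the Gen 1 properties described earlier might look like the following sketch (the account, application Id, key, and tenant Id values are placeholders):

cnxn = pyodbc.connect('DRIVER={CData ODBC Driver for Azure Data Lake Storage};Schema=ADLSGen1;Account=myAccount;OAuthClientId=myApplicationId;OAuthClientSecret=myGeneratedKey;TenantId=myTenantId;')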

Below is the syntax for a DSN:

cnxn = pyodbc.connect('DSN=CData ADLS Source;')

Execute SQL to Azure Data Lake Storage

Instantiate a Cursor and use the execute method of the Cursor class to execute any SQL statement.

cursor = cnxn.cursor()
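pyodbc connections do not autocommit by default, so any data-modifying statement should be followed by a call to commit on the connection. The statement below is only a hypothetical sketch; whether a given table or column accepts writes depends on the driver and your data source:

# Hypothetical write; table/column updatability depends on the driver and data source
cursor.execute("UPDATE Resources SET Permission = ? WHERE FullPath = ?", 'rwx', '/myDirectory/myFile.txt')
cnxn.commit()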

Select

You can use fetchall, fetchone, and fetchmany to retrieve Rows returned from SELECT statements:

import pyodbc

cnxn = pyodbc.connect('DSN=CData ADLS Source;User=MyUser;Password=MyPassword')
cursor = cnxn.cursor()

cursor.execute("SELECT FullPath, Permission FROM Resources WHERE Type = 'FILE'")
rows = cursor.fetchall()
for row in rows:
    print(row.FullPath, row.Permission)
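fetchone and fetchmany follow the same pattern when you want rows one at a time or in batches; a brief sketch reusing the query above:

cursor.execute("SELECT FullPath, Permission FROM Resources WHERE Type = 'FILE'")
row = cursor.fetchone()  # one Row at a time; None when no rows remain
while row:
    print(row.FullPath, row.Permission)
    row = cursor.fetchone()
# fetchmany(n) works the same way, returning up to n rows per call (an empty list when exhausted)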

You can pass query parameters either as a sequence or as individual arguments:

cursor.execute("SELECT FullPath, Permission FROM Resources WHERE Type = ?", 'FILE')
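Equivalently, the parameters can be supplied as a single sequence, such as a list:

cursor.execute("SELECT FullPath, Permission FROM Resources WHERE Type = ?", ['FILE'])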

Metadata Discovery

You can use the getinfo method to retrieve data such as information about the data source and the capabilities of the driver. The getinfo method passes through input to the ODBC SQLGetInfo method.

cnxn.getinfo(pyodbc.SQL_DATA_SOURCE_NAME)
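The Cursor class also exposes catalog methods that map to the ODBC catalog functions, which is useful for discovering the tables and columns the driver reports. The table name passed to columns below is the Resources table used in the earlier queries:

for table in cursor.tables():
    print(table.table_name)

for column in cursor.columns(table='Resources'):
    print(column.column_name, column.type_name)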

You are now ready to build Python apps in Linux/UNIX environments with connectivity to Azure Data Lake Storage data, using the CData ODBC Driver for Azure Data Lake Storage.