
Writing Log With Python Logging Module In Databricks To Azure Datalake Not Working

I'm trying to write my own log files to Azure Data Lake Gen 2 from a Python notebook in Databricks, using the Python logging module. Unfortunately I

Solution 1:

You can use azure_storage_logging handler:

import logging
from azure_storage_logging.handlers import BlobStorageRotatingFileHandler

log = logging.getLogger('service_logger')

# filename, account_name, account_key, maxBytes and container are
# placeholders -- substitute your own values
azure_blob_handler = BlobStorageRotatingFileHandler(filename,
                                                    account_name=account_name,
                                                    account_key=account_key,
                                                    maxBytes=maxBytes,
                                                    container=container)
log.addHandler(azure_blob_handler)
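Since `BlobStorageRotatingFileHandler` follows the same rotation semantics as the stdlib `logging.handlers.RotatingFileHandler`, you can verify your logger wiring locally before pointing it at Blob Storage. A minimal sketch, with a local temp file standing in for the blob container (the path and format string are illustrative, not from the original answer):

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

# Local stand-in for the blob container: RotatingFileHandler provides the
# rotation behaviour that BlobStorageRotatingFileHandler builds on.
log_path = os.path.join(tempfile.mkdtemp(), "service.log")

log = logging.getLogger("service_logger")
log.setLevel(logging.INFO)

handler = RotatingFileHandler(log_path, maxBytes=1024 * 1024, backupCount=3)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
log.addHandler(handler)

log.info("logger wired up correctly")
handler.flush()
```

If this writes and rotates as expected, swapping in the Azure handler only changes where the rotated files end up.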

Solution 2:

Let me explain the steps for accessing, and performing write operations on, Azure Data Lake Storage using Python:

1) Register an application in Azure AD


2) Grant permission in data lake for the application you have registered


3) Get the client secret from Azure AD for the application you registered.

4) Write code to mount the Azure Data Lake directory, like below:

dbutils.fs.mkdirs("/mnt/mountdatalake")

configs = {"dfs.adls.oauth2.access.token.provider.type": "ClientCredential",
           "dfs.adls.oauth2.client.id": "Registered_Client_Id_From_Azure_Portal",
           "dfs.adls.oauth2.credential": "Client_Secret_Obtained_From_Azure_Portal",
           "dfs.adls.oauth2.refresh.url": "https://login.microsoftonline.com/Your_Directory_ID/oauth2/token"}

dbutils.fs.mount(
    source="adl://mydata.azuredatalakestore.net/mountdatabricks",
    mount_point="/mnt/mountdatalake",
    extra_configs=configs)

Once the mount is configured with the application's client credentials, you can access the directory and write logs to it.
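With the mount in place, files written under `/dbfs/mnt/mountdatalake` from Python land in the lake. A minimal sketch of attaching a plain `FileHandler` to that path; here a temp directory stands in for the `/dbfs/...` prefix so the snippet runs outside Databricks (the logger name and file name are assumptions for illustration):

```python
import logging
import os
import tempfile

# Stand-in for "/dbfs/mnt/mountdatalake" so the sketch runs anywhere;
# on Databricks you would point log_file at the mounted path instead.
mount_dir = tempfile.mkdtemp()
log_file = os.path.join(mount_dir, "notebook.log")

logger = logging.getLogger("datalake_logger")
logger.setLevel(logging.INFO)

fh = logging.FileHandler(log_file)
fh.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(fh)

logger.info("written through the mount")
fh.flush()
```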

For example, below I extracted a couple of records from SQL Server and stored them in Azure Data Lake.

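The same idea can be sketched with the stdlib `csv` module: write the extracted rows to a file under the mounted path. The records and the output path below are made up for illustration; on Databricks you would target `/dbfs/mnt/mountdatalake/...` or use Spark's own writers instead:

```python
import csv
import os
import tempfile

# Hypothetical rows standing in for records pulled from SQL Server.
records = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]

# Temp dir stands in for the mounted path /dbfs/mnt/mountdatalake.
out_path = os.path.join(tempfile.mkdtemp(), "records.csv")

with open(out_path, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "name"])
    writer.writeheader()
    writer.writerows(records)
```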

Hope this helps.
