
Insert your API Query Manager data into Azure SQL Database

In this article, we explain how to insert the data you generate with our API Query Manager into an Azure SQL Database.

With Dataslayer you can pull a URL directly from our API QM product in JSON format (learn how here). Each URL corresponds to a data table.

To load the data from each URL you generate with us into Azure SQL Database, all you need is a lambda function that calls as many URLs as you generate and need, optionally converts the JSON to CSV, and finally uploads the result to your Azure SQL Database. All of this can be done by the lambda function.
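For example, converting the JSON rows returned by an API QM URL into CSV takes only a few lines of standard Python. This is a minimal sketch: the field names (date, clicks, cost) are hypothetical placeholders, since the actual keys depend on the query you built in API Query Manager.

```python
import csv
import io

def json_to_csv(rows):
    """Convert a list of JSON objects (dicts) into a single CSV string."""
    if not rows:
        return ""
    output = io.StringIO()
    # Use the keys of the first row as the CSV header
    writer = csv.DictWriter(output, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return output.getvalue()

# Hypothetical rows, as they might come back from an API QM URL
rows = [
    {"date": "2024-01-01", "clicks": 120, "cost": 34.5},
    {"date": "2024-01-02", "clicks": 98, "cost": 28.1},
]
print(json_to_csv(rows))
```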


Prerequisites

  • An active Azure account.
  • An Azure SQL Database set up.
  • A Python environment for the lambda function (used in our example).
  • A Dataslayer API Query Manager account.

Step 1: Prepare Your Azure SQL Database

Ensure your Azure SQL Database is configured to accept connections. Set up the necessary firewall rules and obtain the connection string, which you will need to connect your lambda function to your database.
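A typical ODBC connection string for Azure SQL looks like the sketch below. The server, database, and credential values are placeholders you must replace with your own; the Encrypt and TrustServerCertificate options shown are the settings Azure SQL normally expects.

```python
# Placeholder values -- replace with your own Azure SQL details
server = "your_server.database.windows.net"
database = "your_database"
username = "your_username"
password = "your_password"

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    f"SERVER={server};PORT=1433;DATABASE={database};"
    f"UID={username};PWD={password};"
    "Encrypt=yes;TrustServerCertificate=no;"
)
print(conn_str)
```

Remember that the client IP your lambda function runs from must be allowed through the server-level firewall, or the connection will be refused.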

Step 2: Create a Lambda Function 

The lambda function will fetch data from Dataslayer's API Query Manager and insert it into your Azure SQL Database. Below is an example lambda function written in Python. This function:

  1. Retrieves data from the API Query Manager.
  2. Connects to your Azure SQL Database.
  3. Inserts the data into a specified table.
import pyodbc
import requests

def lambda_handler(event, context):
    # Dataslayer API URL
    api_url = "Your_Dataslayer_API_URL_here"

    # Fetch data from Dataslayer
    response = requests.get(api_url)
    response.raise_for_status()
    data = response.json()

    # Database connection parameters
    server = 'your_server.database.windows.net'
    database = 'your_database'
    username = 'your_username'
    password = 'your_password'
    driver = '{ODBC Driver 17 for SQL Server}'

    # Connect to Azure SQL Database
    cnxn = pyodbc.connect(
        'DRIVER=' + driver + ';SERVER=' + server + ';PORT=1433;DATABASE='
        + database + ';UID=' + username + ';PWD=' + password
    )
    cursor = cnxn.cursor()

    # Insert data into the SQL Database.
    # Assuming 'data' is a list of dictionaries and your table columns
    # match the dictionary keys.
    for item in data:
        columns = ', '.join(item.keys())
        placeholders = ', '.join(['?'] * len(item))
        sql = "INSERT INTO YourTableName ({}) VALUES ({})".format(columns, placeholders)
        cursor.execute(sql, list(item.values()))

    # Commit the inserts and close the connection
    cnxn.commit()
    cursor.close()
    cnxn.close()

    return {"status": "Data inserted into Azure SQL Database"}

Step 3: Deploy Your Lambda Function 

Deploy this lambda function in a cloud environment or on a server that can execute Python code. Ensure it has internet access to fetch data from Dataslayer's API, and network access to your Azure SQL Database.
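Whichever environment you deploy to, the two third-party packages used by the function must be installed alongside it, for example via a requirements.txt like the one below (package names only; pin versions as your environment requires):

```
pyodbc
requests
```

Note that pyodbc also requires the ODBC Driver 17 for SQL Server to be installed on the host operating system; installing the Python package alone is not enough.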

Step 4: Schedule or Trigger Your Lambda Function

Based on your requirements, you can schedule the lambda function to run at specific intervals or trigger it based on certain events. This ensures your Azure SQL Database is regularly updated with the latest data from Dataslayer's API Query Manager.
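If you host the code as an Azure Function, for instance, a timer trigger is one way to run it on a schedule. The function.json fragment below is an illustrative sketch that fires every hour (the schedule field uses an NCRONTAB expression); adapt the interval to how often your API QM data changes.

```
{
  "bindings": [
    {
      "name": "timer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 * * * *"
    }
  ]
}
```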


By following these steps, you can automate the process of inserting data from Dataslayer's API Query Manager into your Azure SQL Database. This setup allows for real-time analytics and reporting capabilities, leveraging the power of Azure's cloud services and the convenience of Dataslayer's API Query Manager.

As always, please contact us via our live chat on our website or via email if you still have doubts or questions. We are happy to help!