
Insert your API Query Manager data into Azure Data Factory

We understand that navigating data integration can be challenging. That's why we're here to help you through the process of inserting data from your API Query Manager into Azure Data Factory.

Azure Data Factory does not offer a direct Dataslayer integration, but you can use the Dataslayer API, which provides a URL that returns your data in JSON format. Follow these steps:

1. Generate API URL

  1. Log into your Dataslayer account and navigate to the API Query Manager product.

  2. Create or select an existing API query that you wish to use for integration.

  3. Configure your query by selecting the data points, metrics, and date ranges you need.

  4. Generate the API URL, which will output data in JSON format. Ensure you have the correct permissions and authentication methods in place for accessing this URL.
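Before moving on, it is worth confirming that the generated URL actually returns valid JSON. Here is a minimal Python sketch using the requests library; the URL is a placeholder for the one you generated above, and any authentication headers your setup requires would need to be added.

```python
import requests

# Placeholder URL: substitute the one generated by the API Query Manager.
API_URL = "https://<your-generated-dataslayer-url>"

resp = requests.get(API_URL, timeout=30)
resp.raise_for_status()   # fail loudly on HTTP errors (401, 403, 500, ...)
data = resp.json()        # raises ValueError if the body is not valid JSON
print(type(data), str(data)[:200])
```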


2. Set up Azure Data Factory

  1. Open Azure Portal and navigate to your Azure Data Factory instance.

  2. Create a new pipeline within your Data Factory's authoring tool.

  3. Within the pipeline, add a 'Copy Data' activity. This activity will be responsible for copying data from the Dataslayer API QM to a destination in Azure.

3. Configure source dataset (Dataslayer API)

  1. In the 'Copy Data' activity, configure the source dataset to use the HTTP connector.

  2. Create a new HTTP linked service if you haven't already, which will be used to connect to the Dataslayer API. Input the API URL generated from Dataslayer as the request URL.

  3. Specify additional settings for authentication if required by your Dataslayer API setup.

  4. Choose the output format as JSON, matching the Dataslayer API output.
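If you prefer to script this step, here is a hedged sketch using the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, and URL values are placeholders, and the names "DataslayerHttp" and "DataslayerJsonSource" are hypothetical; your exact setup may differ.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    DatasetResource,
    HttpLinkedService,
    HttpServerLocation,
    JsonDataset,
    LinkedServiceReference,
    LinkedServiceResource,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, factory = "<resource-group>", "<factory-name>"

# HTTP linked service pointing at the Dataslayer API host (placeholder URL);
# switch authentication_type if your Dataslayer setup requires credentials.
http_ls = HttpLinkedService(
    url="https://<your-dataslayer-api-host>",
    authentication_type="Anonymous",
)
client.linked_services.create_or_update(
    rg, factory, "DataslayerHttp", LinkedServiceResource(properties=http_ls)
)

# JSON dataset over that linked service; relative_url is the path-and-query
# part of the URL generated by the API Query Manager (placeholder).
source_ds = JsonDataset(
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="DataslayerHttp"
    ),
    location=HttpServerLocation(relative_url="<relative-path-of-your-query>"),
)
client.datasets.create_or_update(
    rg, factory, "DataslayerJsonSource", DatasetResource(properties=source_ds)
)
```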

4. Configure destination dataset

  1. Choose or create a new dataset in Azure Data Factory as the destination for your data. This could be Azure Blob Storage, Azure SQL Database, or another supported service.

  2. Configure the linked service for the destination, providing necessary connection details.

  3. Map the source data from the Dataslayer API to the destination fields. This may involve creating a data mapping in the 'Copy Data' activity settings.
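Continuing the SDK sketch from step 3, a 'Copy Data' activity wiring the two datasets together could look like the following. "DataslayerJsonSource" and "BlobJsonSink" are the hypothetical dataset names from steps 3 and 4, and the destination dataset is assumed to already exist.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CopyActivity,
    DatasetReference,
    JsonSink,
    JsonSource,
    PipelineResource,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Copy JSON from the Dataslayer HTTP dataset into the destination dataset.
copy = CopyActivity(
    name="CopyDataslayerToDestination",
    inputs=[DatasetReference(type="DatasetReference",
                             reference_name="DataslayerJsonSource")],
    outputs=[DatasetReference(type="DatasetReference",
                              reference_name="BlobJsonSink")],
    source=JsonSource(),
    sink=JsonSink(),
)

client.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "DataslayerPipeline",
    PipelineResource(activities=[copy]),
)
```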

5. Schedule and monitor

  1. Schedule the pipeline to run at your preferred intervals. This can be set up within the trigger settings in Azure Data Factory.

  2. Monitor the pipeline runs within Azure Data Factory's monitoring tools. Check for any errors and ensure data is being transferred as expected.
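Triggers are easiest to configure in the authoring UI, but monitoring can also be scripted. Here is a minimal sketch with the azure-mgmt-datafactory SDK that lists runs from the last 24 hours; the resource names are placeholders.

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Query pipeline runs from the last 24 hours and print their status.
now = datetime.now(timezone.utc)
runs = client.pipeline_runs.query_by_factory(
    "<resource-group>", "<factory-name>",
    RunFilterParameters(last_updated_after=now - timedelta(days=1),
                        last_updated_before=now),
)
for run in runs.value:
    print(run.pipeline_name, run.status, run.run_start)
```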

6. Data transformation (optional)

If you need to transform or process the data before it lands in its final destination, consider adding a Data Flow activity to the pipeline. Mapping Data Flows typically read from staged storage rather than the HTTP connector, so the usual pattern is to let the 'Copy Data' activity stage the raw JSON first and run the Data Flow afterwards. This allows for transformations like filtering, aggregation, and more complex operations.
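Data Flows run inside Azure Data Factory itself, but to make the idea concrete, here is a small Python sketch of the same filter-and-aggregate logic using pandas; the records and field names are hypothetical stand-ins for what the Dataslayer API might return.

```python
import pandas as pd

# Hypothetical rows standing in for what the Dataslayer API might return.
records = [
    {"date": "2024-01-01", "campaign": "brand",   "clicks": 120, "cost": 35.5},
    {"date": "2024-01-01", "campaign": "generic", "clicks": 80,  "cost": 22.0},
    {"date": "2024-01-02", "campaign": "brand",   "clicks": 95,  "cost": 30.0},
]
df = pd.DataFrame(records)

# Filter, then aggregate clicks and cost per campaign -- the same kind of
# operations a Data Flow expresses with Filter and Aggregate transformations.
summary = (
    df[df["clicks"] > 50]
    .groupby("campaign")[["clicks", "cost"]]
    .sum()
    .reset_index()
)
print(summary)
```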

7. Security and compliance

Ensure that all connections and data transfers are compliant with your organization's security and privacy policies. This may involve configuring secure connections, managing access control, and auditing data movements.


By following these steps, you should be able to integrate your API QM data into Azure Data Factory and automate your analytics workflows.

If you're encountering an error stating "This URL is not allowed on Azure integrate runtime" when trying to integrate Dataslayer data via API QM into Azure Data Factory, this typically indicates a problem with the network security configuration in Azure Data Factory.


Here's how you can address this issue:

1. Check the Integration Runtime configuration

Ensure that the Integration Runtime used in Azure Data Factory has access to external resources. If you're using Azure's managed Integration Runtime, it should, by default, be able to access public endpoints.
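To see at a glance which runtimes your factory uses, you could list them with the azure-mgmt-datafactory SDK; a minimal sketch with placeholder resource names:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# List every Integration Runtime in the factory; 'Managed' is the
# Azure-hosted runtime, 'SelfHosted' is one you run yourself.
for ir in client.integration_runtimes.list_by_factory("<resource-group>", "<factory-name>"):
    print(ir.name, ir.properties.type)
```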

2. Modify Azure Data Factory's network security settings

  1. Open Azure Data Factory in the Azure Portal.

  2. Navigate to Manage > Integration runtimes.

  3. Click on the Integration Runtime you are using.

  4. Check if there's any network restriction that could be blocking access to the Dataslayer API URL. If you're using a self-hosted Integration Runtime, ensure it's configured to allow outbound HTTP/HTTPS calls to the internet or specifically to the Dataslayer API domain.

3. Configure Azure Virtual Network (VNet) Integration (if applicable)

If your Azure Data Factory is configured to use a VNet, you may need to adjust the network security group (NSG) rules to allow outbound connections to the Dataslayer API URL. This involves:

  1. Navigating to the Virtual Network in the Azure Portal.

  2. Selecting the Subnet used by Azure Data Factory.

  3. Modifying the NSG (Network Security Group) attached to the subnet to allow outbound connections to the Dataslayer API URL.
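Note that NSG rules match IP addresses and service tags rather than URLs, so in practice you allow outbound HTTPS to the API host's address range (or the broad "Internet" service tag if no range is published). As a hedged sketch with the azure-mgmt-network Python SDK, with all names and prefixes as placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import SecurityRule

net = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# NSGs filter by IP or service tag, not URL: target the API host's address
# range, or the "Internet" service tag if no range is published.
rule = SecurityRule(
    protocol="Tcp",
    access="Allow",
    direction="Outbound",
    priority=200,
    source_address_prefix="*",
    source_port_range="*",
    destination_address_prefix="<dataslayer-api-ip-range-or-Internet>",
    destination_port_range="443",
    description="Allow outbound HTTPS to the Dataslayer API",
)
net.security_rules.begin_create_or_update(
    "<resource-group>", "<nsg-name>", "allow-dataslayer-https", rule
).result()
```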

4. Add the Dataslayer API URL to the allowed list

If you have strict network security policies or are using Azure Private Link, you might need to explicitly add the Dataslayer API URL to an allowed list or configure a service endpoint that permits access to that URL.

5. Test connectivity

After making the necessary changes, test the connectivity to the Dataslayer API URL from the Azure Data Factory environment. You might need to use custom activities or scripts in your pipeline to verify that the connection can be established successfully.
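For a quick manual check, you can also run a short script from the machine hosting a self-hosted Integration Runtime (or any host inside the VNet); the URL below is a placeholder for your generated one.

```python
import requests

# Placeholder: use the URL generated by the API Query Manager.
API_URL = "https://<your-generated-dataslayer-url>"

try:
    resp = requests.get(API_URL, timeout=30)
    print("Reachable - HTTP status:", resp.status_code)
except requests.exceptions.ConnectionError as exc:
    print("Network-level failure (DNS, firewall, or NSG rule):", exc)
except requests.exceptions.Timeout:
    print("Timed out - outbound traffic may be blocked or filtered")
```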

6. Contact support

If you've gone through these steps and still encounter issues, it might be helpful to contact Azure support for more specific guidance. There could be other underlying network configurations or restrictions in place that are not immediately apparent.


Adjusting these settings can resolve the error and allow Azure Data Factory to access the Dataslayer API URL successfully.

As always, please contact us via our live chat on our website or via email if you still have doubts or questions. We are happy to help!