Repeatedly importing API data into Dataverse with Power Automate
Tom Senior
Overview
Sometimes we want to import data from an API on a repeating schedule, whether that's once a week, daily, or even every hour. This could be to fetch updated data and append it to a Microsoft Dataverse table, or to rebuild the table altogether with new data.
For this project, I needed to fetch data from an API endpoint once a day and push it into a Microsoft Dataverse table, adding it to the existing records. This method will also work if you want to remove any pre-existing data in the table first (kill and fill) by adding a couple of steps, which I'll go through.
To do this we'll use a Power Automate cloud flow with its Recurrence trigger. The flow also requires the premium HTTP connector, so you may need a Power Automate licence if you don't already have one.
Build the Dataverse Table
First we'll need a Dataverse table to put our data in, so head to make.powerapps.com/entities and create a new table. In this table, create a column for each field you're fetching from the API. In this example we're fetching beverages from punkapi.com.
Build the Power Automate Flow
Create a new cloud flow in Power Automate with the Recurrence trigger. Here you select how frequently you want the flow to run. If you show the advanced settings you can also select the time zone and the time it runs. For this I've selected once a day at 5:00am in my time zone.
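If you're curious what this produces, the flow stores the trigger as JSON in its underlying workflow definition. A rough sketch of a once-a-day, 5:00am recurrence is below; the timeZone value is just an example, so use whichever zone you picked:

```json
{
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "schedule": { "hours": ["5"] },
        "timeZone": "GMT Standard Time"
      }
    }
  }
}
```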
Next we'll make the API request to get the JSON data. Add a new step and select the HTTP operation, then enter your HTTP method and the API endpoint URI. If the API endpoint requires authentication you'll need to configure that here too, usually in the request headers.
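For the Punk API example the request is a simple unauthenticated GET, something along these lines (the per_page query parameter is illustrative; check the API's documentation for what it actually supports):

```
GET https://api.punkapi.com/v2/beers?per_page=25
Accept: application/json
```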
Now we want to parse the JSON response we get from the API, so hit new step and add a Parse JSON operation. In the Content field, pass in the Body of the HTTP request using dynamic content.
For the schema, the best option is to make the API request manually (via a web browser or Postman), hit Generate from sample, and paste in the JSON response to generate the schema from the sample.
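To give a feel for the shape of the data, a single record from the Punk API response looks something like this (heavily trimmed, with illustrative values):

```json
[
  {
    "id": 1,
    "name": "Buzz",
    "tagline": "A Real Bitter Experience.",
    "first_brewed": "09/2007",
    "abv": 4.5,
    "description": "A light, crisp and bitter IPA..."
  }
]
```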
This is where I see a lot of schema errors that stop the flow from running. Sometimes it's best to remove any values from the JSON schema that you're not using. To do this, copy the JSON schema out of Power Automate, paste it into a VS Code session, delete the values you don't need, then paste it back into Power Automate. The reason for using VS Code is that it will flag any errors in the schema as you edit.
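For example, if the flow only maps the id, name and abv fields, the trimmed schema might look like this (a sketch; keep whichever properties your own flow actually uses):

```json
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "id": { "type": "integer" },
      "name": { "type": "string" },
      "abv": { "type": "number" }
    },
    "required": ["id", "name", "abv"]
  }
}
```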
Fetching and parsing the data is now done, so let's push it into the Dataverse table. Add a new step and select the Add a new row Microsoft Dataverse operation. Select the table you created earlier and click Show advanced options. Now we can fill in all the data we need from the Parse JSON step, so use the dynamic content to map the API data to the correct fields in Dataverse. When you add the first row item you'll notice the operation drops into an Apply to each loop.
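Under the hood, each of those dynamic content tokens is just an expression over the current loop item. If you ever need to type one by hand in the expression editor, it looks something like this; the loop and property names will be whatever your flow and schema actually use:

```
items('Apply_to_each')?['name']
```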
That's all we need if you just want to keep adding data to the same table. Go ahead and test the flow and it will populate your table.
If you need to remove the old data before adding the new, follow the steps below.
Removing Old Data
Sometimes we want to go for the kill and fill strategy (poor Phil). That means clearing out the data in the table, then re-populating it with new data. Follow the previous steps, then add these operations straight after the trigger.
So between Recurrence and HTTP, add a new Microsoft Dataverse action called List rows and select the table we created earlier.
Then below that, add another Microsoft Dataverse action called Delete a row. Again select the table we're working with, then for Row ID use dynamic content and find the item with the description Unique identifier for entity instances. It's not clear from the name, but this means the unique ID for the row.
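If you'd rather write the expression by hand, the unique identifier comes back from List rows as a GUID column named after the table's logical name with an id suffix. Assuming a hypothetical table with the logical name cr123_beverage, it would look something like:

```
items('Apply_to_each')?['cr123_beverageid']
```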
You'll notice it pops into an Apply to each loop again.
That's it! Now when we run our flow, it deletes all the rows in the Dataverse table first, then populates it with the data from the new API request.
Thanks for reading. Any questions or feedback: Twitter, LinkedIn or email.