
Azure Function as Output Job Topology of an Azure Stream Analytics Job

19 Dec 2018 · CPOL · 6 min read
In this article, we are going to see how we can set up an Azure Function as an Output job topology of an Azure Stream Analytics job. Doesn’t that sound interesting?

Image 1: Azure IoT Dev Kit

Introduction

For the last few days, I have been playing with my Azure IoT Dev Kit MXChip. In this article, we are going to see how we can set up an Azure Function as an output job topology of an Azure Stream Analytics job. Doesn’t that sound interesting? In our previous articles, we have already seen what an Azure Stream Analytics job is and how to create one using the portal and Visual Studio. If you haven’t read those articles, I strongly recommend reading them first. Let’s jump into this article now.

Background

As I mentioned earlier, in this article, we will be:

  1. using our existing Azure Stream Analytics job
  2. creating a new Azure Function App
  3. setting up the newly created Azure function as an output job topology of the stream analytics job
  4. monitoring the data coming to the Azure Function from the stream analytics job.

Play with Azure Function

Yeah, we are going to play with it. Let’s go and create one then.

Creating an Azure Function

To create an Azure Function application, log in to your Azure portal, click the Create a resource icon, and then search for “Function App”.

In the next screen, provide the following information:

  1. App Name
  2. Subscription
  3. Resource Group
  4. OS
  5. Hosting plan
  6. Location
  7. Runtime stack
  8. Storage
  9. Application Insights

Here, the Consumption hosting plan lets you pay per execution, while the App Service plan gives you a predefined capacity. For the runtime stack, we will use .NET; however, you are free to use anything you wish.

Once you have created it, you should be able to see it under the Function Apps section.
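If you prefer the command line, the same app can be provisioned with the Azure CLI. This is just a sketch of the portal steps above; the resource group, storage account, and app names below are made-up placeholders, and the region is arbitrary.

```shell
# Sketch: provision a Function App with the Azure CLI instead of the portal.
# All names (iot-demo-rg, iotdemostorage123, mxchip-fn-app) are hypothetical.
az group create --name iot-demo-rg --location westeurope

# Every Function App needs a backing storage account.
az storage account create \
  --name iotdemostorage123 \
  --resource-group iot-demo-rg \
  --sku Standard_LRS

# Consumption plan with the .NET runtime, matching the portal walkthrough.
az functionapp create \
  --name mxchip-fn-app \
  --resource-group iot-demo-rg \
  --storage-account iotdemostorage123 \
  --consumption-plan-location westeurope \
  --runtime dotnet
```

After this, the app shows up under Function Apps just as it does when created from the portal.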

Creating an Azure Function Solution and Function

Now let’s go to our Visual Studio and create a new solution for our Azure Function.

Image 2: Azure Function App Project Type

Now right-click on your newly created project and add a new HttpTrigger function. We will keep the Access Rights set to Anonymous for now. I have named my function “GetData”. For now, let’s just receive the data from our Stream Analytics job and check its length.

C#
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

namespace ml.IoTPlatform.AzureFunctions
{
    public static class GetData
    {
        [FunctionName("GetData")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "post")]HttpRequestMessage req,
            ILogger log)
        {
            log.LogInformation($"GetData function triggered with Uri {req.RequestUri}");         
            string content = await req.Content.ReadAsStringAsync();
            log.LogInformation($"String content is {content}");
            dynamic data = JsonConvert.DeserializeObject(content);
            log.LogInformation($"Data count is {data?.Count}");
            if (data?.ToString()?.Length > 262144)
            {
                return new HttpResponseMessage(HttpStatusCode.RequestEntityTooLarge);
            }
            return req.CreateResponse(HttpStatusCode.OK, "Success");
        }
    }
}

As you can see, we are not doing much yet: we receive the data as an HttpRequestMessage, read the content with req.Content.ReadAsStringAsync(), and then deserialize it. If you skip this step, you may get an error such as “No MediaTypeFormatter is available to read an object of type ‘Object’ from content with media type ‘application/octet-stream’”.

We also check the content length, and if it is larger than 262144 bytes (256 KB), we return an HttpResponseMessage with status code 413 (Request Entity Too Large).
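As a quick sanity check outside the function, you can measure a sample payload against that same 262144-byte limit from a shell. The field names below just mirror the MXChip telemetry columns used in this article; the values are invented.

```shell
# Check a sample event batch against the 262144-byte (256 KB) limit that
# the function above enforces. Field names mirror the MXChip telemetry
# columns; the values are made up.
payload='[{"messageId":1,"deviceId":"mxchip-01","temperature":27.3,"humidity":61.2,"pressure":1013.2}]'

size=$(printf '%s' "$payload" | wc -c)
limit=262144

if [ "$size" -gt "$limit" ]; then
  echo "too large: $size bytes, expect HTTP 413"
else
  echo "ok: $size bytes"
fi
```

A single MXChip event is tiny, so only a large batched window of events would ever come near the limit.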

Publish the Azure Function App

To publish your Azure Function app, right-click on your project, click Publish, and set up your publish target by choosing the existing Azure Function App (remember, we created one earlier?). Once you have published it, you can go into your Function App and see your function. You can also test it with some dummy data.

You may get an error such as “Web Deploy cannot modify the file on the destination because it is locked by an external process” when you try to publish your Function App from Visual Studio while the Function App is running. To fix this, you can see my answer here.
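One way to work around that lock, sketched below with hypothetical app and resource-group names, is to stop the app before deploying and start it again afterwards. The func command comes from the Azure Functions Core Tools; publishing again from Visual Studio while the app is stopped works just as well.

```shell
# Workaround sketch for the Web Deploy file-lock error: stop the app,
# deploy, then start it again. The app and resource-group names are
# hypothetical placeholders.
APP=mxchip-fn-app
RG=iot-demo-rg

az functionapp stop --name "$APP" --resource-group "$RG"
func azure functionapp publish "$APP"   # or publish from Visual Studio instead
az functionapp start --name "$APP" --resource-group "$RG"
```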

Image 3: Function App in Portal

Azure Stream Analytics Job

Let’s go back to our Azure Stream Analytics job now, as we have already configured our Azure Function App successfully.

Configure Azure Function Output

In my previous article, we created an Azure Stream Analytics job solution using Visual Studio. Let’s open that solution now and configure the new output for the Azure Function.

Image 4: Solution Explorer

While configuring the Azure Function Output, please make sure that you are selecting the existing Azure function app.

Image 5: Azure Function Output Configuration

Update the Script

We should also make some changes in our Script.asaql file to support our newly created output.

SQL
WITH BasicOutput AS 
(
SELECT    
    messageId,
    deviceId,
    temperature,
    humidity,
    pressure,
    pointInfo,
    IoTHub,
    MAX(EventEnqueuedUtcTime) AS EventEnqueuedUtcTime,
    EventProcessedUtcTime,
    PartitionId    
FROM
    Input TIMESTAMP BY EventEnqueuedUtcTime
GROUP BY
    TUMBLINGWINDOW(second, 10),
    messageId, 
    deviceId,
    temperature,
    humidity,
    pressure,
    pointInfo,
    IoTHub,
    EventEnqueuedUtcTime,
    EventProcessedUtcTime,
    PartitionId
)
SELECT * INTO SQLServerOutput FROM BasicOutput
SELECT * INTO AzureFunctionOutput FROM BasicOutput

Updating the TLS Version

Once that is done, just click the Submit to Azure button; if you have any doubts about this section, read my previous posts on this topic. Now let’s log in to the portal again and check whether all the outputs, the inputs, and the query have been published.

Image 6: Outputs in Portal

Cool! Well done, it seems like it is published. Now, if you click on the AzureFunctionOutput, you may get a warning: “Please make sure that the Minimum TLS version is set to 1.0 on your Azure Functions before you start your ASA job”. I would rather treat this as an error than a warning, because without making this change, our Azure Stream Analytics job will not write to our Azure Function. This is very important: I spent many hours on this and finally found it was the root cause of my issue. You can see my answer about this here.

So just go to your Azure Function App and click Platform features -> SSL -> Minimum TLS Version, and set it to 1.0.

Image 7: Setting TLS Version

There is a saying that developers don’t care about warnings, only errors, and in some cases it is true. Hmm, I was just kidding.

Output

Once you have done everything mentioned above, you are good to go and start your Stream Analytics job. Please make sure that your MXChip is connected to a power source so that the device can start sending data.

Checking the SQL Server Output

Now let’s log in to our SQL Server Database and run the below query to make sure that we are getting the data from the device.

SQL
SELECT TOP (1000) [Id]
      ,[messageId]
      ,[deviceId]
      ,[temperature]
      ,[humidity]
      ,[pressure]
      ,[pointInfo]
      ,[IoTHub]
      ,[EventEnqueuedUtcTime]
      ,[EventProcessedUtcTime]
      ,[PartitionId]
  FROM [dbo].[StreamData]
  ORDER BY [EventEnqueuedUtcTime] DESC

Image 8: SQL Server Output Data

Checking Azure Function Output

To check the Azure Function Output, we can go back to our Azure Function and click on the Function and use the Monitor option.

Image 9: Azure Function Output Data

Please note that you can always check your Azure Stream Analytics job’s Activity Log if you find that something is not working.

Conclusion

In this article, we learned how to:

  1. work with an existing Azure Stream Analytics job
  2. create an Azure Function App
  3. create an Azure Function solution in Visual Studio
  4. write an HttpTrigger function and publish it to the Azure Function App
  5. set up the Azure Function App as an output job topology of an Azure Stream Analytics job

In our next article, we will see how you can send this Azure Function Output data to an Azure SignalR service and then get the same data in an Angular Application. I can’t wait to write my next article.

Your Turn. What Do You Think?

Thanks a lot for reading. I will come back with another post on the same topic very soon. Did I miss anything that you think is needed? Did you find this post useful? Kindly do not forget to share your feedback.

You can always see my IoT articles here.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


Written By
Software Developer
Germany Germany
I am Sibeesh Venu, an engineer by profession and writer by passion. I’m neither an expert nor a guru. I have been awarded Microsoft MVP 3 times, C# Corner MVP 5 times, DZone MVB. I always love to learn new technologies, and I strongly believe that the one who stops learning is old.

My Blog: Sibeesh Passion
My Website: Sibeesh Venu
