Azure Stream Analytics Team Blog

(Cross Post) Processing IIS ETW events using Azure Stream Analytics


Are you looking for an easy way to process Event Tracing for Windows (ETW) logs? Well, I just came across this excellent blog post by Tomas Restrepo. In it, Tomas shows how easily you can collect IIS ETW logs, send them to Event Hubs in JSON format, and use Azure Stream Analytics to get real-time insights from them.

Do you have something great to share?  Tweet us at @AzureStreaming


New Language Features: LAST function, new Array and Record functions


We are pleased to share several updates to the Azure Stream Analytics query language included in today’s Azure update. 

LAST function

We have added the new analytics function LAST, which enables you to retrieve the most recent event in a data stream in a given timeframe.  This function was requested by customers to enable scenarios like performing “last known good” value lookups.

For example, this query will return the last not-null sensor reading for every event. 

SELECT
    sensorId, 
    LAST(reading) OVER (PARTITION BY sensorId LIMIT DURATION(hour, 1) WHEN reading IS NOT NULL)
FROM input 

Note that the WHEN clause is used inside the OVER construct to express the condition for the events to be considered in the function. Similarly, the WHEN clause can now be used across all Analytic Functions. 
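As a minimal sketch (assuming the same stream of sensorId and reading fields as above), the same WHEN condition can be applied to another analytic function such as LAG to fetch the previous non-null reading:

SELECT
    sensorId,
    LAG(reading, 1) OVER (PARTITION BY sensorId LIMIT DURATION(hour, 1) WHEN reading IS NOT NULL) AS previousReading
FROM input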

New Array functions

Previously, Azure Stream Analytics enabled access to array data via the CROSS APPLY operator coupled with the GetElements function.

Now it is easier to access individual array elements with these new functions (a short sketch follows the list):

GetArrayElement – Get an individual element from an array field

GetArrayElements – Get all array values and indexes

GetArrayLength – Get the length of an array
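For example, a query along these lines (assuming an input with an array-typed field named arrayField) returns the first element and the array length for every event:

SELECT
    GetArrayElement(arrayField, 0) AS firstElement,
    GetArrayLength(arrayField) AS arrayLength
FROM input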

New Record functions

Previously, Stream Analytics allowed accessing nested fields using dot notation.

In this release we added new functions to help in cases where field names are dynamic and are not known at the time of query authoring.

GetRecordPropertyValue – Returns the property value given the path

GetRecordProperties – Get all record properties, to be used with the CROSS APPLY operator
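As a quick sketch (assuming an input event with a nested SensorReadings record and a hypothetical targetSensor field carrying the property name), the dynamic lookup looks like this:

SELECT
    DeviceId,
    GetRecordPropertyValue(SensorReadings, targetSensor) AS requestedReading
FROM input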

Query Pattern of the Week: Find the duration of a condition


To increase familiarity with the Stream Analytics Query Language, we are kicking off a blog series that will feature a new query pattern every week from Common Stream Analytics Query Patterns.  In addition to the examples and explanations found in the documentation, posts will also include a sample data file that you can use to test out the query directly in an ASA job.

This week’s pattern is below and you can read about more patterns here.  Looking to solve a problem that you don’t see captured yet?  Let us know via our msdn forum or by tweeting us @AzureStreaming.

Description: Determine the length of time that a given condition occurred

Example Scenario: Suppose you have a data stream of vehicles crossing a toll bridge.  A bug occurred causing all cars to have an incorrect weight (over 20,000 pounds) for a period of time.  Determine the duration that this bug impacted car data.

Input:

Make      Time                            Weight
Honda     2015-01-01T00:00:01.0000000Z    2000
Toyota    2015-01-01T00:00:02.0000000Z    25000
Honda     2015-01-01T00:00:03.0000000Z    26000
Toyota    2015-01-01T00:00:04.0000000Z    25000
Honda     2015-01-01T00:00:05.0000000Z    26000
Toyota    2015-01-01T00:00:06.0000000Z    25000
Honda     2015-01-01T00:00:07.0000000Z    26000
Toyota    2015-01-01T00:00:08.0000000Z    2000

 

Test File:

Download from GitHub: CarWeights.json

Query

SELECT
    PrevGood.Time AS StartFault,
    ThisGood.Time AS EndFault,
    DATEDIFF(second, PrevGood.Time, ThisGood.Time) AS FaultDurationSeconds
FROM
    Input AS ThisGood TIMESTAMP BY Time
    INNER JOIN Input AS PrevGood TIMESTAMP BY Time
    ON DATEDIFF(second, PrevGood, ThisGood) BETWEEN 1 AND 3600
    AND PrevGood.Weight < 20000
    INNER JOIN Input AS Bad TIMESTAMP BY Time
    ON DATEDIFF(second, PrevGood, Bad) BETWEEN 1 AND 3600
    AND DATEDIFF(second, Bad, ThisGood) BETWEEN 1 AND 3600
    AND Bad.Weight >= 20000
    LEFT JOIN Input AS MidGood TIMESTAMP BY Time
    ON DATEDIFF(second, PrevGood, MidGood) BETWEEN 1 AND 3600
    AND DATEDIFF(second, MidGood, ThisGood) BETWEEN 1 AND 3600
    AND MidGood.Weight < 20000
WHERE
    ThisGood.Weight < 20000
    AND MidGood.Weight IS NULL

 

Output:

StartFault                      EndFault                        FaultDurationSeconds
2015-01-01T00:00:01.0000000Z    2015-01-01T00:00:08.0000000Z    7
2015-01-01T00:00:01.0000000Z    2015-01-01T00:00:08.0000000Z    7
2015-01-01T00:00:01.0000000Z    2015-01-01T00:00:08.0000000Z    7
2015-01-01T00:00:01.0000000Z    2015-01-01T00:00:08.0000000Z    7
2015-01-01T00:00:01.0000000Z    2015-01-01T00:00:08.0000000Z    7
2015-01-01T00:00:01.0000000Z    2015-01-01T00:00:08.0000000Z    7

 

Explanation: We are looking for two “good” events with a run of consecutive “bad” events (weight of 20,000 or more) in between.  This is implemented with self-joins: two INNER JOINs validate that we get a good -> bad -> good sequence by checking the weights and comparing the timestamps, while the LEFT JOIN together with the IS NULL check ensures that no other good event falls in between.  We can then compute the duration between the starting and ending good events, which gives us the duration of the bug.

Working with complex data types in Azure Stream Analytics


Azure Stream Analytics supports processing events in a variety of data formats (CSV, JSON, Avro).

JSON and Avro can contain complex types such as nested objects (records) or arrays. Many Stream Analytics customers use nested data and we have received various questions related to processing events containing records and/or arrays.

This blog post provides an overview of available record and array functions and includes query examples for typical operations.

Record data type

A Record is a collection of name/value pairs. The Record data type is used to represent JSON objects and Avro records when corresponding formats are used in the input data streams.

Let’s assume we are processing data events with sensor readings in JSON format and walk through a few common operations over records.

Here is an example of a single event:

{
    "DeviceId" : "12345",
    "Location" : { "Lat" : 47, "Long" : 122 },
    "SensorReadings" :
    {
        "Temperature" : 80,
        "Humidity" : 70,
        "CustomSensor01" : 5,
        "CustomSensor02" : 99
    }
}

 

Access nested record fields

You can use dot notation to access nested fields. For example, this query selects lat/long coordinates of the device:

SELECT
    DeviceId,
    Location.Lat,
    Location.Long
FROM input

 

Access nested record fields when the field name is dynamic

If the property name is not known at query authoring time, use the GetRecordPropertyValue function.

For example, imagine our sample stream is joined with reference data containing thresholds for each device sensor:

SELECT
    input.DeviceId,
    thresholds.SensorName
FROM input
JOIN thresholds
ON
    input.DeviceId = thresholds.DeviceId
WHERE
    GetRecordPropertyValue(input.SensorReadings, thresholds.SensorName) > thresholds.Value
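For illustration, the thresholds reference input might contain rows of this shape (hypothetical values, one row per device sensor):

DeviceId    SensorName        Value
12345       Temperature       75
12345       CustomSensor01    10

A query like the one above then compares each named sensor reading against its per-device threshold at runtime, without hard-coding the property names.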

 

Extract record fields as separate events

To convert record fields into separate events, use the CROSS APPLY operator with the GetRecordProperties function. For example, to convert our sample stream into a stream of events with individual sensor readings, use this query:

SELECT
    event.DeviceId,
    sensorReading.PropertyName,
    sensorReading.PropertyValue
FROM input AS event
CROSS APPLY GetRecordProperties(event.SensorReadings) AS sensorReading

Array data type

An Array is an ordered collection of values. The Array data type is used to represent JSON and Avro arrays when corresponding formats are used in the input data streams.

In this section we will explain how to perform common operations over array values. Examples assume input events that have an “arrayField” property of array type.

Select array element at specified index

For example, to select the first array element use the GetArrayElement function: 

SELECT
    GetArrayElement(arrayField, 0) AS firstElement
FROM input

 

Select array length

To retrieve the length of an array, use the GetArrayLength function:

SELECT
    GetArrayLength(arrayField) AS arrayLength
FROM input

 

Select all array elements as individual events

To extract all array elements as individual events use CROSS APPLY operator with the GetArrayElements function:  

SELECT
    arrayElement.ArrayIndex,
    arrayElement.ArrayValue
FROM input AS event
CROSS APPLY GetArrayElements(event.arrayField) AS arrayElement

Query Pattern of the Week: Specify logic for different cases


Last week we kicked off a new blog series to highlight a query pattern with a real world example every week.  This week we examine how to use CASE statements to specify conditional logic.  For more query patterns, check out the Common Stream Analytics Query Patterns page.

Description: Evaluate one of multiple results based on a list of if/then/else conditions.

Example: Suppose you have a data stream of vehicles crossing a toll bridge.  Provide a string description of how many cars of each make have been recorded in a 10 second interval, with a special case for 1.

Input:

Make      Time
Honda     2015-01-01T00:00:01.0000000Z
Toyota    2015-01-01T00:00:02.0000000Z
Toyota    2015-01-01T00:00:03.0000000Z

 

Test File:

Download from GitHub: CarMakes.json

Query:

SELECT
    CASE
        WHEN COUNT(*) = 1 THEN CONCAT('1 ', Make)
        ELSE CONCAT(CAST(COUNT(*) AS NVARCHAR(MAX)), ' ', Make, 's')
    END AS CarsPassed,
    System.TimeStamp AS Time
FROM
    Input TIMESTAMP BY Time
GROUP BY
    Make,
    TumblingWindow(second, 10)

 

Output:

CarsPassed    Time
1 Honda       2015-01-01T00:00:10.0000000Z
2 Toyotas     2015-01-01T00:00:10.0000000Z

 

Explanation: The CASE statement allows us to provide a different computation based on some criteria (in our case the count of cars in the aggregate window).

(Cross Post) Refreshing reference data with Azure Data Factory for Azure Stream Analytics Jobs


Since we introduced the refresh capability for our reference data, we have seen a lot of asks about how the reference data could be generated on a particular schedule. Some customers have used SSIS on a standalone machine, while others have written custom applications running on an Azure Virtual Machine to populate the reference data on a specific schedule. These solutions are tough to manage in a production scenario because someone has to monitor their availability. Well, the wait is over! Azure Data Factory is a cloud-based managed service that provides an easy way to build your reference data on your desired schedule. Check out this blog post to learn more.

New: Support for Power BI Groups and Documentation Updates


As the fall begins to kick into high gear, we are excited to have another great new feature to share with you today: Support for pushing output from your Stream Analytics job to Power BI Groups, allowing the output and any associated dashboards etc. to be shared with other Power BI users.

Groups in Power BI bring you and your colleagues together to collaborate, communicate, and connect with your data across Office 365. Create a group in either Power BI or Office 365. From your Stream Analytics job you can now select to send output to a specific group within your Power BI account or choose to simply continue to write to "My Workspace". Then invite co-workers into your group workspace where you can collaborate on shared dashboards, reports, and datasets. For more information on Groups in Power BI check out the Power BI Groups documentation.

In addition to the release of Power BI Groups support, we are also releasing a number of updates to our Stream Analytics documentation addressing customer feedback and making it easier to get started learning about the product. To learn more check out the new Stream Analytics Learning Map on Azure.com.

 

Azure Stream Analytics @ The First Ever Cortana Analytics Workshop!


The Azure Stream Analytics (ASA) team was proud to be a part of the first ever Cortana Analytics Workshop, held September 10-11, 2015. The 2-day event was sold out: we had around 700 attendees, including around 500 customers/partners from 293 companies and 35 countries (almost 1 in 5 attendees were non-US). For those newly hearing about Cortana Analytics, below is a picture that summarizes the various components in the suite.  Azure Stream Analytics is a key component in the analytics stack, providing real-time stream processing optimized for low-latency, high-throughput, resilient workloads.

 

The tutorial session that the team hosted for ASA had a full house, with around 140 participants putting the infrastructure we had set up for the lab to the test at scale!  The tutorial, titled “Unlocking Real-Time Insights for Your IoT Data”, walked participants through gaining real-time insights on device data ingested through Azure Event Hubs and processed using ASA.  Below are some highlights from the session and the workshop. You can try the tutorial at your leisure by accessing the step-by-step instructions here.

To learn more about Azure Stream Analytics, access our new Stream Analytics Learning Map on the Azure Stream Analytics documentation page!

 

Keynote Highlights:

  • Joseph Sirosh, Corporate Vice President of Information Management & Machine Learning, kicked off the workshop with an opening keynote on the “Future of Analytics”. He talked about the intelligent cloud, how it’s eating software + data, the power of transforming data into intelligence & actions, Microsoft’s vision for analytics that is Agile, Simple & Beautiful, and how the Cortana Analytics Suite (CAS) puts us on that path.
  • Jason Wilcox, Partner Director of Information Management, had a great session on “Demystifying Cortana Analytics” – he talked about the kinds of questions companies are asking of their data, the [convoluted] processes for getting those questions answered today, and how CAS can help streamline this process end-to-end.
  • James Phillips, Corporate Vice President of Power BI, had a very polished closing keynote for the day on “Power BI 2.0” in the context of the data platform. He talked about how the software connecting businesses to customers has progressively strengthened over the years – first by the web, then mobile, and now IoT. Along the way, tons more data are generated, with an unprecedented “data dividend” awaiting companies that embrace this change / opportunity. Power BI is the single pane of glass to make sense of this data and gain deeper insights. And Microsoft, with CAS, is the only “full coverage vendor” with solutions for all classes of data – transactional, big data, streaming – be it on-prem or cloud, i.e. a “single throat to choke”. No other vendor gives you this.

 

For the latest news on the workshop and other updates, visit Joseph Sirosh’s blog.


Query Pattern of the Week: Send data to multiple outputs


Have you checked out our team’s collection of Common Stream Analytics Query Patterns? This location acts as a repository for query patterns commonly used by our customers. One pattern that frequently comes up in real-world applications is directing job data to multiple outputs to enable both a hot path and a cold path for data.  Read on for more details!

Description: In a single job, send processed data to multiple outputs

Example: Analyze a stream of vehicle data and alert over a given condition, while archiving all events to long term storage.

Input:

Make      Time
Honda     2015-01-01T00:00:01.0000000Z
Honda     2015-01-01T00:00:02.0000000Z
Toyota    2015-01-01T00:00:01.0000000Z
Toyota    2015-01-01T00:00:02.0000000Z
Toyota    2015-01-01T00:00:03.0000000Z

Test File:

Download from GitHub: MultipleOutputs.json

Query:

SELECT
    *
INTO
    ArchiveOutput
FROM
    Input TIMESTAMP BY Time

SELECT
    Make,
    System.TimeStamp AS Time,
    COUNT(*) AS [Count]
INTO
    AlertOutput
FROM
    Input TIMESTAMP BY Time
GROUP BY
    Make,
    TumblingWindow(second, 10)
HAVING
    [Count] >= 3

 

Outputs:

ArchiveOutput:

Make      Time
Honda     2015-01-01T00:00:01.0000000Z
Honda     2015-01-01T00:00:02.0000000Z
Toyota    2015-01-01T00:00:01.0000000Z
Toyota    2015-01-01T00:00:02.0000000Z
Toyota    2015-01-01T00:00:03.0000000Z

AlertOutput:

Make      Time                            Count
Toyota    2015-01-01T00:00:10.0000000Z    3

Explanation:

The INTO clause tells Stream Analytics which of the outputs to direct data to.  The first query is a pass-through of all data received by the job. The second query does some simple aggregation and filtering and could potentially send results to a downstream alerting system.

Looking for more examples?

To learn about more patterns, check out the Common Stream Analytics Query Patterns page.  Looking to solve a problem that you don’t see captured yet?  Let us know via our msdn forum or by tweeting us @AzureStreaming.

Stream Analytics updates for the Azure IoT Suite


Today Microsoft announced the availability of the Azure IoT Suite, a collection of preconfigured solutions that enable you to easily develop, deploy, and scale your Internet of Things solutions.

Stream Analytics is a core service in the IoT Suite and as part of this announcement, we have delivered several new features today.  Included in today’s updates are presence in the Azure preview portal, support for DocumentDB output, and support for IoT Hub input.  For details, see Santosh Balasubramanian’s blog post.  The full announcement of the IoT Suite can also be found here.

Query Pattern of the Week: Use expressions inside a TIMESTAMP BY clause


Azure Stream Analytics allows expressing complex event processing rules using a simple SQL-like query language. Given the temporal nature of Stream Analytics queries, it is important to specify a timestamp for every input event.  By default, Stream Analytics will use the arrival time of the input event – e.g. if an Event Hub is used as an input, the timestamp will be the time when the event was received by the Event Hub.

For many streaming applications it is important to use the exact time when an event occurred. For these cases, the TIMESTAMP BY clause can be used to specify a custom timestamp.  For example, if we want to use the column “time” as the event timestamp, the query will look like:

SELECT *
FROM input TIMESTAMP BY [time]

Until recently, you could only use field names within the TIMESTAMP BY clause. Some customers have asked for more flexibility – for example, one may want to adjust the time from local time to UTC, convert from a different time format (like UNIX epoch time), etc.

With the most recent update of Stream Analytics, you can use any expression of type DATETIME within the TIMESTAMP BY clause.
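As a small sketch (assuming a hypothetical epochTime column holding seconds since the UNIX epoch), a conversion expression can be used directly in the clause:

SELECT *
FROM input TIMESTAMP BY DATEADD(second, epochTime, CAST('1970-01-01T00:00:00Z' AS datetime))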

Example: Imagine we receive data from two different types of sensors. One type of sensor uses “time” as the name of the timestamp field, while the second type uses the “readtime” field.

Input:

sensorid    value    time                   readtime
s1          70       2015-09-29T17:48:42
s2          55                              2015-09-29T17:48:49
s1          78       2015-09-29T17:48:53
s2          45                              2015-09-29T17:48:57
s1          82       2015-09-29T17:49:02
s1          73       2015-09-29T17:49:07
s2          51                              2015-09-29T17:49:11
s2          48                              2015-09-29T17:49:15

 

Test File: Download from GitHub: DifferentTimestamps.json

Query:

SELECT [sensorid], AVG([value])
FROM input TIMESTAMP BY
    CASE WHEN [time] IS NOT NULL
        THEN [time]
        ELSE [readtime]
        END
GROUP BY [sensorid], TUMBLINGWINDOW(hour, 1)

Output:

sensorid    avg
s1          75.75
s2          49.75

Explanation:

By adding an expression inside the TIMESTAMP BY clause, we can use the same query to handle both input formats and produce average values for both types of sensors.

Looking for more examples?

To learn about more patterns, check out the Common Stream Analytics Query Patterns page.  Looking to solve a problem that you don’t see captured yet?  Let us know via our msdn forum or by tweeting us @AzureStreaming.

(Cross Post) Azure Stream Analytics and DocumentDB for your IoT application


Support for output to Azure DocumentDB from Stream Analytics jobs has been highly requested by customers and was the top-voted idea on the Azure Feedback Forum.  Stream Analytics and DocumentDB are both core services in the Azure IoT Suite and have recently been updated to include first-class integration with one another. 

Today the integration between these two services was featured on the Azure blog.  You can view the full post here.

Sending and consuming events in Avro format


Azure Stream Analytics currently supports three formats for input event serialization: Avro, CSV, and JSON. This blog post will demonstrate how to send events to an input source in the Avro format, to be later consumed by a Stream Analytics job.
For the examples below, assume that we are sending events to an Event Hub instance.

Let’s start by defining the events that will be sent to the input source.

[DataContract]
public class SampleEvent
{
    [DataMember]
    public int Id { get; set; }
}

We will be using the “Microsoft.Hadoop.Avro” library for Avro serialization. You will need to add a NuGet reference for this library through Project -> Manage NuGet Packages.

 

Here is how the packages.config file looks after adding the NuGet references. I have added packages for Service Bus as well.

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Hadoop.Avro" version="1.5.6" targetFramework="net45" />
  <package id="Microsoft.WindowsAzure.ConfigurationManager" version="3.1.0" targetFramework="net45" />
  <package id="Newtonsoft.Json" version="6.0.4" targetFramework="net45" />
  <package id="WindowsAzure.ServiceBus" version="3.0.6" targetFramework="net45" />
</packages>

 

Now let’s look at the serialization code. We will be using classes from the Microsoft.Hadoop.Avro namespace. Stream Analytics expects the events to be serialized sequentially in an Avro container.

// Requires: System.IO, System.Linq, Microsoft.Hadoop.Avro, Microsoft.Hadoop.Avro.Container
private class AvroEventSerializer<T>
{
    public byte[] GetSerializedEvents(IEnumerable<T> events)
    {
        if (events == null || !events.Any())
        {
            return null;
        }

        // Serialize the events sequentially into a single Avro container;
        // the schema is written once into the container header.
        using (var memoryStream = new MemoryStream())
        using (var avroWriter = AvroContainer.CreateWriter<T>(memoryStream, Codec.Null))
        using (var sequentialWriter = new SequentialWriter<T>(avroWriter, events.Count()))
        {
            foreach (var e in events)
            {
                sequentialWriter.Write(e);
            }

            return memoryStream.ToArray();
        }
    }
}

The above code serializes the events in Avro format and includes the schema in each payload. Azure Stream Analytics requires the schema to be specified with the payload. Note that the container holds multiple events while the schema is specified only once.

Finally, let’s send events to the Event Hub using the above code:

private static void Main(string[] args)
{
    var eventHubClient =
        EventHubClient.CreateFromConnectionString(
            "<ReplaceWithServiceBusConnectionString>",
            "<ReplaceWithEventHubName>");

    var avroEventSerializer = new AvroEventSerializer<SampleEvent>();

    while (true)
    {
        // Serialize a small batch of events into one Avro container and send it as a single message.
        var eventsPayload =
            avroEventSerializer.GetSerializedEvents(
                Enumerable.Range(0, 5).Select(i => new SampleEvent() { Id = i }));

        eventHubClient.Send(new EventData(eventsPayload));
        Thread.Sleep(TimeSpan.FromSeconds(10));
    }
}

We have seen example code for sending events in Avro format to an Event Hub. These events can now be consumed by a Stream Analytics job by configuring an Event Hub input and selecting the Avro format.
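As a minimal sketch (assuming the Event Hub above is configured as a job input named avroInput with Avro serialization), the job can then query the Id field directly:

SELECT
    Id,
    COUNT(*) AS EventCount
FROM avroInput
GROUP BY Id, TumblingWindow(second, 30)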


VMob's cool retail sales boosting IoT solution


We are very excited to share an interesting solution built by one of our customers, VMob. New Zealand-based VMob is harnessing IoT to help McDonald’s transform its customer engagement in the Netherlands, Sweden and Japan, regions that represent about 12 percent of the food service retailer’s locations worldwide. With VMob, McDonald’s expanded its existing mobile app in these markets, building on standard features such as product information, a restaurant locator and mass offers for promotions and specials. They did this by combining the mobile app with contextual information and social engagement to dynamically personalize the customer experience.

Watch this cool video to see the solution in action: https://www.youtube.com/watch?v=EsT04Uopl7o&feature=youtu.be

A detailed article can be found here: http://blogs.microsoft.com/iot/2015/01/12/boosting-retail-sales-with-iot-powered-customer-engagement/

How do I transform incoming events using Stream Analytics?


Question: The data that arrives from the devices has the format Time, Id, P1, P2, P3. Can this be transformed using Stream Analytics into the other/vertical format and stored in SQL? This interesting question was asked by one of our users recently, and our good friend Paolo Salvatori has put up a well-written solution on his blog. Thanks, Paolo, for the quick write-up.

Here is the example:

Instead of having a horizontal structure on the tables, e.g.

Time, Id, P1, P2, P3
21-01-2016 12:23:22, XYZ, 3, 5, 7

we are thinking of a vertical structure, e.g.

Time, Id, Type, Value
21-01-2016 12:23:22, XYZ, P1, 3
21-01-2016 12:23:22, XYZ, P2, 5
21-01-2016 12:23:22, XYZ, P3, 7

The reason for this is that the different devices send different data and we can accommodate this.
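This is not necessarily Paolo's exact approach, but one way to sketch such a pivot in the Stream Analytics query language is to select each property into its own row and combine the results (this assumes UNION is available in your job's query language version):

SELECT Time, Id, 'P1' AS Type, P1 AS Value FROM input
UNION
SELECT Time, Id, 'P2' AS Type, P2 AS Value FROM input
UNION
SELECT Time, Id, 'P3' AS Type, P3 AS Value FROM input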


Tumbling, Sliding and Hopping Windows with Azure Streaming

Stream Analytics query to check when no data has arrived from a given device in a configurable time window


Paolo Salvatori describes well here how you might write a Stream Analytics query to check when no data has arrived from a given device in a configurable time window. To solve this problem, the idea is to correlate the data stream containing real-time events (e.g. sensor readings) with device reference data. Thanks again, Paolo, for a nicely written solution.

Querying JSON array with Azure Stream Analytics


Kent Weare has written a nice post on querying a JSON array with Azure Stream Analytics. He is getting device reads off of an Azure Event Hub. These reads are being aggregated on the publisher side and placed into a single message/event. Since the publisher creates a message structure that contains many device reads for that specific interval, Kent wanted to ensure he could process each element in the array within his Stream Analytics query.
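The general shape of such a query uses CROSS APPLY with GetArrayElements, along the lines of the sketch below (DeviceReads, DeviceId and Value are hypothetical field names standing in for Kent's actual message structure):

SELECT
    deviceRead.ArrayValue.DeviceId,
    deviceRead.ArrayValue.Value
FROM input AS event
CROSS APPLY GetArrayElements(event.DeviceReads) AS deviceRead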

Hello "Toll" - Toll Booth tutorial to ramp you up on Azure Stream Analytics


A tolling station is a common phenomenon – we encounter them in many expressways, bridges, and tunnels across the world. Each toll station has multiple toll booths, which may be manual – meaning that you stop to pay the toll to an attendant, or automated – where a sensor placed on top of the booth scans an RFID card affixed to the windshield of your vehicle as you pass the toll booth. It is easy to visualize the passage of vehicles through these toll stations as an event stream over which interesting operations can be performed.

We recently updated our "Hello World"-like tutorial for Azure Stream Analytics. It is posted here.

Please try it out and let us know your feedback. What did you like? Was it easy to follow the tutorial? Would you like us to post more such Hello World examples? We would love to hear from you!

Notify users of data received from sensors or other systems


How many times have you gone to a vending machine, only to find that the one snack bar you wanted out of the 50,000 things in the machine has run out?  If you are really hungry you could kick the machine, but that wouldn’t help. Say you write a note to the vending machine company. Let’s see what the vending machine company can do with this information today. Joe, our vending machine owner, looks at the note and thinks of ways to reduce this customer dissatisfaction. He could replace additional slots with this snack bar. But wait, what if we at Azure could help Joe? Spyros Sakellariadis explains in this blog how to use Azure Stream Analytics to notify users of data received from sensors or other systems. Imagine remote telemetry from this vending machine is sent to Azure. Using Stream Analytics, you can continuously monitor the inventory you have stocked in the vending machine, the purchases and money received that arrive via telemetry, and the current state of the inventory in the machine. Now, if only you could write up a neat query and send a push notification to the nearest route driver as soon as you know the snack bar is about to run out. Thanks, Spyros, for this nicely written post.
