<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Luuk Mager]]></title><description><![CDATA[Cloud Solution Architect | Developer | Blogger]]></description><link>https://www.luukmager.com/</link><image><url>https://www.luukmager.com/favicon.png</url><title>Luuk Mager</title><link>https://www.luukmager.com/</link></image><generator>Ghost 1.24</generator><lastBuildDate>Fri, 17 Apr 2026 23:40:38 GMT</lastBuildDate><atom:link href="https://www.luukmager.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Your data journey: how to accelerate innovation?]]></title><description><![CDATA[In this article, we will look at how Microsoft Azure may help your organization innovate by using technologies, data, AI algorithms and machine learning.]]></description><link>https://www.luukmager.com/your-data-journey-how-to-accelerate-innovation/</link><guid isPermaLink="false">60dda8a7d558be24203ea9f6</guid><category><![CDATA[Azure]]></category><category><![CDATA[Azure Synapse Analytics]]></category><category><![CDATA[Data]]></category><category><![CDATA[Machine Learning]]></category><category><![CDATA[Power BI]]></category><dc:creator><![CDATA[Luuk Mager]]></dc:creator><pubDate>Thu, 01 Jul 2021 11:51:14 GMT</pubDate><media:content url="https://www.luukmager.com/content/images/2021/07/john-adams-1xIN4FMR78A-unsplash-1.jpg" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://www.luukmager.com/content/images/2021/07/john-adams-1xIN4FMR78A-unsplash-1.jpg" alt="Your data journey: how to accelerate innovation?"><p><img src="https://www.luukmager.com/content/images/2021/07/john-adams-1xIN4FMR78A-unsplash.jpg" alt="Your data journey: how to accelerate innovation?"><br>
<small>Photo by <a href="https://unsplash.com/@johnladams?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">John Adams</a> / <a href="https://unsplash.com/?utm_source=ghost&amp;utm_medium=referral&amp;utm_campaign=api-credit">Unsplash</a></small></p>
<p>When it comes to innovation, leading companies throughout the world communicate openly about it. The specifics differ, but they all have an innovative culture, agility and speed in common. The last two, in particular, are critical for any organization's ability to innovate. But where do you begin? We are fortunate to have new technologies, such as the cloud, that encourage innovation. In this article, we will look at how Microsoft Azure can help your organization innovate using technology, data, AI algorithms and machine learning.</p>
<h3 id="challengesinstartingyourdatajourney">Challenges in starting your data journey</h3>
<p>Data has become a critical component of the decision-making process. But how can an organization become more data-driven, enabling it to be more agile and innovative? There is no way around the cloud as the foundation for a secure and governed data platform. Businesses are frequently faced with a dilemma: whether to move fast or to remain secure. Establishing a cloud foundation provides the ability to do both.</p>
<p>Although analytics and AI are among the most important investments for company leaders, they face these challenges in becoming more data-driven:</p>
<ul>
<li>How to get the best return on investment;</li>
<li>Cultural difficulties;</li>
<li>How to become experienced data users;</li>
<li>And how to get rid of data silos.</li>
</ul>
<p><em>Read the Harvard Business Review Analytic Services report, <a href="https://www.readitquik.com/white-paper/business-support-solutionsservices/understanding-why-analytics-strategies-fall-short-for-some-but-not-others/">Understanding Why Analytics Strategies Fall Short for Some, but Not Others</a>.</em></p>
<h3 id="twocriticalanalyticssystems">Two critical analytics systems</h3>
<p>Another challenge is that businesses are forced to maintain two critical, yet independent analytics systems: a data warehouse and a data lake. Most established companies have a data warehouse with structured relational data. But do you want to experiment with big data and apply data science? Then a data lake is essential for exploring unstructured data and using algorithms to search for relationships. Azure Synapse provides both a data warehouse and a data lake to start your data journey.</p>
<h3 id="whatisazuresynapse">What is Azure Synapse?</h3>
<p>Azure Synapse, Microsoft’s PaaS (Platform as a Service) offering, integrates all your data (on-premises, cloud and streaming data) in a single environment.</p>
<p><img src="https://www.luukmager.com/content/images/2021/07/AzureSynapse.png" alt="Your data journey: how to accelerate innovation?"></p>
<p>In the platform you will find three unique components:</p>
<ol>
<li>Business-critical data and dashboards, supported by data warehousing;</li>
<li>Data lakes to clean and explore (un)structured data;</li>
<li>Power BI (seamless integration) to create interactive data representations.</li>
</ol>
<p>If you are using Synapse and have a proper dataset, you can start using Azure Machine Learning. So, to make data-driven decisions, you do not have to build a lot of custom solutions or use many different tools. It all comes together in Azure Synapse.</p>
<h3 id="advantagesofdataanalyticsinazure">Advantages of data analytics in Azure</h3>
<p>Once you have that data foundation on Microsoft Azure, you can start your data analytics. Of those who had already done so, 85% agreed or strongly agreed that well-integrated analytics databases and storage, a data management stack and business intelligence tools were beneficial to their organization. Furthermore, customer satisfaction increased by 60% and they experienced 27% faster time to insights by using data. Finally, data can support businesses in cutting costs by 26% (<a href="https://azure.microsoft.com/en-us/resources/forrester-tei-microsoft-azure-analytics-with-power-bi/">see this study</a>).</p>
</div>]]></content:encoded></item><item><title><![CDATA[5 reasons why you should use a Cloud Adoption Framework]]></title><description><![CDATA[Do you already know about the Microsoft Cloud Adoption Framework? This blog gives five reasons why you definitely should use a Cloud Adoption Framework.]]></description><link>https://www.luukmager.com/5-reasons-why-you-should-use-a-cloud-adoption-framework/</link><guid isPermaLink="false">608fde6b6e25911784155c15</guid><category><![CDATA[Microsoft]]></category><category><![CDATA[Azure]]></category><category><![CDATA[Cloud Adoption]]></category><dc:creator><![CDATA[Luuk Mager]]></dc:creator><pubDate>Mon, 03 May 2021 11:40:43 GMT</pubDate><media:content url="https://www.luukmager.com/content/images/2021/05/brian-patrick-tagalog-DH4dJW3cw74-unsplash-SMALL.jpg" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://www.luukmager.com/content/images/2021/05/brian-patrick-tagalog-DH4dJW3cw74-unsplash-SMALL.jpg" alt="5 reasons why you should use a Cloud Adoption Framework"><p>Do you already know about the Microsoft Cloud Adoption Framework? It is proven guidance designed to assist you in developing and implementing the business and technology strategies your company needs to succeed in the cloud. The framework offers best practices, documentation and tools that cloud developers, IT experts and business decision makers need to achieve their desired business outcomes. This standardized approach also helps ensure greater coordination between business and technology strategies. Here are five reasons why you definitely should use a Cloud Adoption Framework.</p>
<p><img src="https://www.luukmager.com/content/images/2021/05/caf.png" alt="5 reasons why you should use a Cloud Adoption Framework"></p>
<h2 id="1governancebudgetarycontrolandaccessrights"><strong>1. Governance: budgetary control and access rights</strong></h2>
<p>One of the most critical aspects of your IT environment is keeping a tight grasp on the budget. In the cloud this may be even more relevant. The Cloud Adoption Framework enables you to apply governance at scale to your cloud environment in a controlled manner, preventing unwelcome surprises on your monthly statement.</p>
<p>When you grant application teams complete freedom, you may end up with very expensive cloud resources that are not always necessary. The framework helps clarify cost flows, establishes budgets, directs access rights and restricts the use of certain resources when they are not in line with your policy. This way, you can keep a close eye on costs and access management within your cloud platform.</p>
<h2 id="2cloudadoptionframeworkensuressecurityandcompliance"><strong>2. Cloud Adoption Framework ensures security and compliance</strong></h2>
<p>The more you work in the cloud, the more important it is to work securely and in accordance with the GDPR. The framework gives you guidance in setting up (policy-driven) guardrails for your cloud environment. By doing this, all your cloud solutions are secure and compliant by default.</p>
<p>In ‘old-fashioned’ IT, security was implemented at the highest level of the environment. It was difficult to apply different levels of security to different solutions. This led to (unnecessarily) high costs and often a degraded level of functionality and usability. The Cloud Adoption Framework guides you in setting up an application-centric approach. Now, you can work with a standard level of security that applies to all applications and add additional security measures for specific components. This is referred to as layered defence. Within that framework, you can give the application team complete freedom to do their work.</p>
<h2 id="3aholisticapproachwiththecloudadoptionframework"><strong>3. A holistic approach with the Cloud Adoption Framework</strong></h2>
<p>Cloud adoption starts well before the selection of a cloud application provider. It begins when business and IT decision makers understand that the cloud will assist them in accomplishing a particular business transformation goal. The Cloud Adoption Framework is therefore a full-lifecycle framework that goes beyond technology. From defining your strategy and objectives to actually setting up your cloud environment and providing code templates: the Cloud Adoption Framework assists organizations in aligning their business, culture and technological transformation in order to successfully achieve short-term and long-term objectives.</p>
<h2 id="4applicableforlargeandsmallcompanies"><strong>4. Applicable for large and small companies</strong></h2>
<p>The overall Cloud Adoption Framework is an extensive system with many processes. As an enterprise organization, you can use the entire framework in your organization and cloud environment from start to finish. However, even as a small business, you can easily adopt one aspect of the framework and extend it to more elements later. We use the Cloud Adoption Framework at a large retailer to set up the Azure platform, but also at smaller businesses, for example to organize security. The framework has proven appropriate for organizations of any type and scale.</p>
<h2 id="5quickresultsusingthecloudadoptionframework"><strong>5. Quick results using the Cloud Adoption Framework</strong></h2>
<p>When you use the framework, you do not have to reinvent the wheel when it comes to configuring your Azure platform. Furthermore, you can be sure that you are using the cloud in the way that Microsoft intended and that best practices are being followed. We also use the Cloud Adoption Framework templates and tailor them for the customer, which saves a lot of time. As a result, you have a cloud infrastructure that is optimally built and that allows you to get the most out of it. You can innovate on a scalable, future-oriented platform without compromising security, costs or privacy.</p>
</div>]]></content:encoded></item><item><title><![CDATA[Visualize real-time sensor data using Microsoft Azure]]></title><description><![CDATA[In this post I want to show how easy it is to setup a live stream of sensor data on the Azure platform and visualize this in Power BI.]]></description><link>https://www.luukmager.com/visualize-real-time-sensor-data-using-microsoft-azure/</link><guid isPermaLink="false">5be06835cb15263ea41988bf</guid><category><![CDATA[Azure]]></category><category><![CDATA[IoT Hub]]></category><category><![CDATA[Stream Analytics]]></category><category><![CDATA[Power BI]]></category><dc:creator><![CDATA[Luuk Mager]]></dc:creator><pubDate>Mon, 05 Nov 2018 16:06:23 GMT</pubDate><media:content url="https://www.luukmager.com/content/images/2018/11/overview-1.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://www.luukmager.com/content/images/2018/11/overview-1.png" alt="Visualize real-time sensor data using Microsoft Azure"><p><img src="https://www.luukmager.com/content/images/2018/11/overview.png" alt="Visualize real-time sensor data using Microsoft Azure"></p>
<p>In this post I want to show how easy it is to set up a live stream of sensor data on the Azure platform and visualize it in Power BI.<br>
I used an MXChip as a sensor and let it send temperature and humidity data, but you’re free to use any other device to send the data.</p>
<p><strong>Prerequisites</strong><br>
•	Active Azure Subscription<br>
•	Power BI account</p>
<p><strong>IoT Hub</strong><br>
First, create an IoT Hub. For this example, you can use the Free tier (F1). After the IoT Hub is created, add a device (in the menu go to Explorers &gt;&gt; IoT Devices). Use the connection string of your added device to connect to the IoT Hub.</p>
<p>If you’re using the MXChip, you can follow the Get Started tutorial (<a href="https://microsoft.github.io/azure-iot-developer-kit/docs/get-started/">https://microsoft.github.io/azure-iot-developer-kit/docs/get-started/</a>) to set up your device and connect it to the IoT Hub.</p>
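<p>If you are not using the MXChip, any device that sends a JSON payload with the right fields will do. Below is a minimal Python sketch of such a telemetry message; the field names <code>messageId</code>, <code>temperature</code> and <code>humidity</code> match what the Stream Analytics query selects, while the sending step (the azure-iot-device SDK calls mentioned in the comment) is an assumption, not part of this tutorial:</p>

```python
import json
import random

def build_telemetry(message_id: int) -> str:
    # Build a JSON payload with the fields the Stream Analytics query
    # selects: messageId, temperature and humidity.
    return json.dumps({
        "messageId": message_id,
        "temperature": round(random.uniform(18.0, 28.0), 2),
        "humidity": round(random.uniform(40.0, 60.0), 2),
    })

payload = build_telemetry(1)
print(payload)
# A real device would now send this payload to the IoT Hub, for example with
# the azure-iot-device SDK: create a client via
# IoTHubDeviceClient.create_from_connection_string(...) and call
# client.send_message(Message(payload)).
```

<p>Any payload shaped like this will flow through the IoT Hub input unchanged, so the exact device hardware does not matter.</p>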
<p><strong>Stream Analytics Job</strong><br>
The next step is to setup a Stream Analytics Job. This job will send the data from the IoT Hub to Power BI.<br>
From the menu in de Stream Analytics Job use the Job topology items.</p>
<p><img src="https://www.luukmager.com/content/images/2018/11/streamanalytics_menu.png" alt="Visualize real-time sensor data using Microsoft Azure"></p>
<ol>
<li>Create a stream input (choose IoT Hub) and select the IoT Hub which receives the sensor data.</li>
<li>Create a stream output (choose Power BI). Fill in the required fields and use the ‘Authorize’ button to connect to your Power BI account.</li>
<li>Create a query to read the data from the input into the output.<br>
For example:</li>
</ol>
<pre><code>SELECT
    messageId, EventEnqueuedUtcTime, humidity, temperature
INTO
    [OUTPUT ALIAS]
FROM
    [INPUT ALIAS]
WHERE humidity IS NOT NULL AND temperature IS NOT NULL
</code></pre>
<ol start="4">
<li>If your device is connected and sending data to the IoT Hub, start the Stream Analytics Job.</li>
</ol>
<p><em>The Stream Analytics query language is a subset of standard T-SQL syntax for streaming computations. For a full list of the supported syntax features, see:<br>
<a href="https://docs.microsoft.com/en-us/stream-analytics-query/stream-analytics-query-language-reference">https://docs.microsoft.com/en-us/stream-analytics-query/stream-analytics-query-language-reference</a></em></p>
<p><strong>Power BI</strong><br>
Open Power BI online (<a href="https://powerbi.microsoft.com">https://powerbi.microsoft.com</a>) using the same account as provided by the stream output in the previous steps. Navigate to the Datasets. When the Stream Analytics Job is set up correctly and your sensor is sending data the streaming dataset should be visible.</p>
<p><img src="https://www.luukmager.com/content/images/2018/11/powerbi_dataset.png" alt="Visualize real-time sensor data using Microsoft Azure"></p>
<p>You can create a report using the icon on the right side of the dataset.<br>
Create a line chart. Depending on the fields your dataset contains, you can create a chart like this:</p>
<p><img src="https://www.luukmager.com/content/images/2018/11/powerbi_chart.png" alt="Visualize real-time sensor data using Microsoft Azure"></p>
</div>]]></content:encoded></item><item><title><![CDATA[Epoch to DateTime in Data Lake Analytics]]></title><description><![CDATA[Use a C# function to convert an Epoch time (Unix time) to a DateTime in Azure Data Lake Analytics.]]></description><link>https://www.luukmager.com/epoch-to-datetime-in-date-lake-analytics/</link><guid isPermaLink="false">5bd2dd43cb15263ea41988b4</guid><category><![CDATA[Azure]]></category><category><![CDATA[Data Lake]]></category><category><![CDATA[Data Lake Analytics]]></category><category><![CDATA[U-SQL]]></category><dc:creator><![CDATA[Luuk Mager]]></dc:creator><pubDate>Fri, 07 Sep 2018 09:24:00 GMT</pubDate><media:content url="https://www.luukmager.com/content/images/2018/10/Azure-Data-Lake-Analytics_COLOR.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://www.luukmager.com/content/images/2018/10/Azure-Data-Lake-Analytics_COLOR.png" alt="Epoch to DateTime in Data Lake Analytics"><p>This blog post describes how to convert an Epoch time (Unix time) to a DateTime using a C# function, and how you can use this in Azure Data Lake Analytics.<br>
Unix time is defined as the number of seconds since midnight (UTC) on 1st January 1970.</p>
<p>The easiest way to work with Data Lake assemblies from Visual Studio is by installing the Data Lake Tools (see: <a href="https://docs.microsoft.com/en-us/azure/data-lake-analytics/data-lake-analytics-data-lake-tools-install">https://docs.microsoft.com/en-us/azure/data-lake-analytics/data-lake-analytics-data-lake-tools-install</a>).</p>
<p><strong>Prerequisites</strong><br>
This blog assumes that you have a valid Azure subscription with at least the following resources running:<br>
•	Data Lake Storage<br>
•	Data Lake Analytics</p>
<p><strong>.NET Class Library</strong><br>
First, we create a .NET class library which we later can use in Azure Data Lake Analytics. To do this follow the next steps.</p>
<ol>
<li>
<p>Create a new U-SQL Class Library (assuming Data Lake Tools are installed)<br>
<img src="https://www.luukmager.com/content/images/2018/10/01.png" alt="Epoch to DateTime in Data Lake Analytics"></p>
</li>
<li>
<p>Create a class and add the following function:</p>
<pre><code> public static DateTime UnixTimeStampToDateTime(long unixTimeStamp)
 {
     DateTime dateTimeResult = 
         new DateTime(1970, 1, 1, 0, 0, 0, 0, DateTimeKind.Utc);
     return dateTimeResult.AddSeconds(unixTimeStamp).ToLocalTime();
 }
</code></pre>
</li>
</ol>
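<p>The same conversion is easy to sanity-check outside of U-SQL. A minimal Python equivalent of the C# function above (note that the C# version additionally calls <code>ToLocalTime()</code>; this sketch stays in UTC):</p>

```python
from datetime import datetime, timezone

def unix_timestamp_to_datetime(unix_timestamp: int) -> datetime:
    # Seconds since midnight (UTC) on 1 January 1970, as a UTC datetime
    return datetime.fromtimestamp(unix_timestamp, tz=timezone.utc)

print(unix_timestamp_to_datetime(1530003614).isoformat())
# → 2018-06-26T09:00:14+00:00
```

<p>This is handy for verifying the output of the U-SQL job against a known timestamp.</p>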
<p><strong>Data Lake Analytics</strong><br>
The next step is to register the assembly in Data Lake Analytics.</p>
<ol>
<li>Create a database to store the .NET assembly. To do this, run a job with the following script:</li>
</ol>
<pre><code>CREATE DATABASE IF NOT EXISTS ReferenceDB;
</code></pre>
<ol start="2">
<li>
<p>Deploying the library to Data Lake Analytics is easy using Visual Studio. Right-click the project and select ‘Register’. Select your Data Lake Analytics account and the database you created in the previous step.</p>
</li>
<li>
<p>Submit the registration. After a successful registration your assembly is visible in the data explorer.<br>
<img src="https://www.luukmager.com/content/images/2018/10/02.png" alt="Epoch to DateTime in Data Lake Analytics"></p>
</li>
</ol>
<p>At this point you’re ready to use the created function in a U-SQL job.<br>
Create a new job and use the following query:</p>
<pre><code>REFERENCE ASSEMBLY ReferenceDB.EpochConvert;
 
@result = 
SELECT *
FROM (VALUES 
     (1530003614,
     EpochConvert.EpochConvert.UnixTimeStampToDateTime(1530003614))) AS vt(ts, dt);

OUTPUT @result
TO &quot;/Output/epochConversion.csv&quot;
USING Outputters.Csv();
</code></pre>
<p>This query generates two columns: one with the timestamp value and one with the converted value. The result is saved to a CSV file in Data Lake Storage. You can browse the output using the Data Explorer.</p>
<p><img src="https://www.luukmager.com/content/images/2018/10/03.png" alt="Epoch to DateTime in Data Lake Analytics"></p>
</div>]]></content:encoded></item><item><title><![CDATA[Azure Information Protection labels in SharePoint Online]]></title><description><![CDATA[<div class="kg-card-markdown"><p>Want to use your Azure Information Protection labels in SharePoint Online? This blogpost describes the way to do this.</p>
<p><strong>Azure</strong><br>
Let’s start with some Azure Information Protection configuration within the Azure Portal.<br>
Navigate to your AIP policy and select Advanced Properties as shown in the image below.</p>
<p><img src="https://www.luukmager.com/content/images/2018/09/01.png" alt="01"></p>
<p>Add the</p></div>]]></description><link>https://www.luukmager.com/azure-information-protection-labels-in-sharepoint-online/</link><guid isPermaLink="false">5b9a532cb94d4c13f49dfb42</guid><category><![CDATA[Azure]]></category><category><![CDATA[Azure Information Protection]]></category><category><![CDATA[Office 365]]></category><dc:creator><![CDATA[Luuk Mager]]></dc:creator><pubDate>Wed, 04 Jul 2018 09:04:00 GMT</pubDate><media:content url="https://www.luukmager.com/content/images/2018/09/azure-aip.png" medium="image"/><content:encoded><![CDATA[<div class="kg-card-markdown"><img src="https://www.luukmager.com/content/images/2018/09/azure-aip.png" alt="Azure Information Protection labels in SharePoint Online"><p>Want to use your Azure Information Protection labels in SharePoint Online? This blogpost describes the way to do this.</p>
<p><strong>Azure</strong><br>
Let’s start with some Azure Information Protection configuration within the Azure Portal.<br>
Navigate to your AIP policy and select Advanced Properties as shown in the image below.</p>
<p><img src="https://www.luukmager.com/content/images/2018/09/01.png" alt="Azure Information Protection labels in SharePoint Online"></p>
<p>Add the following properties (name/ value):</p>
<ul>
<li>SyncPropertyState : TwoWay</li>
<li>SyncPropertyName : <em>Classification</em></li>
</ul>
<p>The value of the last property needs to be the same as the value for the title field in the global properties of the policy (see image below).</p>
<p><img src="https://www.luukmager.com/content/images/2018/09/02-1.png" alt="Azure Information Protection labels in SharePoint Online"></p>
<p><strong>SharePoint Online</strong><br>
The next part is to configure SharePoint Online. Navigate to the library of your choice and add a new choice column. The name of the column must again be the same as the title field (see the image above). In the list of choices enter the exact label names as defined in Azure.</p>
<p>That’s it! Now we have everything in place. Create a document and add it to the library. You will see the classification label both in Word and SharePoint Online.</p>
<p><img src="https://www.luukmager.com/content/images/2018/09/04.png" alt="Azure Information Protection labels in SharePoint Online"></p>
<p><strong>Remarks</strong><br>
This solution isn’t perfect; there are some pitfalls:</p>
<ul>
<li>Setting the label from SharePoint Online will set the correct label, but it will not trigger the defined AIP protection options.</li>
<li>A document created directly in SharePoint Online will not set the column. If you open the document and change the label, the column is set correctly. Also, if you save the document locally first and then upload it to SharePoint, the column is set.</li>
</ul>
<p>The integration between AIP and Office 365 will improve in the future, but for now this can certainly be a useful solution.</p>
</div>]]></content:encoded></item></channel></rss>