
#ifdef WINDOWS – Dev Center Analytics with Hannah Jimma


Analytics are invaluable when it comes to continued success with the Microsoft Store. Dev Center analytics enable developers to measure success, learn how users engage with their creations, and take targeted actions to increase reach and continuously improve the overall experience.

I met with Hannah Jimma, a program manager on the Microsoft Store team focusing on Dev Center analytics, and I learned about what analytics are available today and some new features the team is working on for the future. Among other things, we covered:

  •        Measuring the success of acquisitions, including what data is available on how users are discovering your apps
  •        Understanding how users are using your apps, and quickly discovering bugs and crashes
  •        Capturing feedback from your users and interacting with them
  •        Analyzing data that is specific to Xbox
  •        Trying out new features in the Dev Center with the Dev Center Insider Program

and much more. Check out the video above for the full overview and feel free to reach out on Twitter or in the comments!



Windows Template Studio 1.6 released!


We’re extremely excited to announce Windows Template Studio 1.6!

In this release, we added app scheme launch support, finalized our work on localization, made massive improvements in accessibility, and started our work on Visual Basic support.

What’s new:

For the full list of adjustments in the 1.6 release, head over to WTS’s GitHub.

New Pages:

  • Image Gallery

New Features:

  • Drag and drop source
  • Web to App Link

Template improvements:

  • Minor tweaks for Fluent
  • Fixed a few pages that were setting a background color

Improvements to the Wizard:

  • Lots of under the hood bug fixes and code improvements
  • Much more Visual Basic engine work
  • Work for supporting multiple projects in a single solution
  • Work to support Prism

How to get the update:

There are two paths to update to the newest build.

  • Already installed: Visual Studio should auto-update the extension. To force an update, go to Tools->Extensions and Updates, open the Updates expander on the left, select Windows Template Studio, and click “Update.”
  • Not installed: Head to https://aka.ms/wtsinstall, click “download” and double-click the VSIX installer.

Known issue

We are tracking an issue (#1532) where uninstalling or upgrading may produce the error “A value for ‘Component’ needs to be specified in the catalog.”

If you get this error, we need logs so we can track it down with the Visual Studio team. We don’t know how to reproduce it, but we know a few people have hit this scenario.

Instructions for capturing these logs are in the tracking issue on GitHub.

What else is cooking for next versions?

We love all the community support and participation. Here are just a few of the things we are currently building that will land in future builds:

  • Visual Basic support (In nightly)
  • Prism support (In nightly)
  • Improved update system to help increase speed of startup and file size download
  • Improved user interface in-line with Visual Studio
  • Continued refinement with Fluent design in the templates
  • Ink templates
  • Improved Right-click->add support for existing projects

In partnership with the community, we’ll continue cranking out and iterating on new features and functionality. We’re always looking for additional people to help out; if you’re interested, please head to our GitHub at https://aka.ms/wts. If you have an idea or feature request, please make the request!


Take a Break with Azure Functions


So, it’s Christmas time. The office is empty, the boss is away, and you’ve got a bit of free time on your hands. How about learning a new skill and having some fun?

Azure Functions is a serverless technology that executes code in response to various triggers (e.g., a URL is called, an item is placed on a queue, a file is added to blob storage, a timer goes off). There are all sorts of things you can do with Azure Functions, like running CPU-intensive calculations, calling various web services and reporting results, sending messages to groups – nearly anything you can imagine. But unlike traditional applications and services, there’s no need to set up an application host or server that’s constantly running, waiting to respond to requests or triggers. Azure Functions are deployed as and when needed, to as many servers as needed, to meet the demands of incoming requests. There’s no need to set up and maintain hosting infrastructure, you get automatic scaling, and – best of all – you only pay for the cycles used while your functions are executing.

Want to have a go and try your hand at the latest in web technologies? Follow along to get started with your own Azure Functions.

In this post, I’ll show you how to create an Azure Function that triggers every 30 minutes and writes a note into your Slack channel telling you to take a break. We’ll create a new Function app, generate the access token for Slack, then run the function locally.

Prerequisites: Visual Studio 2017 with the Azure development workload (which includes the Azure Functions tools and the Azure Storage Emulator), and a Slack workspace you can install apps into.

Create a Function App (Timer Trigger)

We all know how important it is to take regular breaks if you spend all day sitting at a desk, right? So, in this tutorial, we’ll use a Timer Trigger function to post a message to a Slack channel at regular intervals to remind you (and your whole team) to take a break. A Timer Trigger is a type of Azure Function that is triggered to run on regular time intervals.

Just run it

If you want to skip ahead and run the function locally, fetch the source from this repo, insert the appropriate Slack channel(s) and OAuth token in the local.settings.json file, start the Azure Storage Emulator, then Run (or Debug) the Functions app in Visual Studio.

Step-by-step guide
  1. Open Visual Studio 2017 and select File->New Project.
  2. Select Azure Functions under the Visual C# category.
  3. Provide a name (e.g. TakeABreakFunctionApp) and press OK.
    The New Function Project dialog will open.
  4. Select Azure Functions v1 (.NET Framework), choose Timer trigger, and press OK.
    Note: This will also work with Azure Functions v2, but for this tutorial I’ve chosen v1, since v2 is still in preview.


    A new solution is created with a Functions App project and single class called Function1 that contains a basic Timer trigger.

  5. Edit Function1.cs.
    • Add helper methods:
      • Env (for fetching environment variables)
      • SendHttpRequest (for sending authenticated http requests)
      • SendMessageToSlack (for generating and sending the appropriate Slack request – based on environment variables)
    • Update method: Run
      • Change the return type to async Task.
      • Add an asynchronous call to the SendMessageToSlack method.
      • Update cron settings for the TimerTrigger attribute.
    • Add appropriate Using statements.

  6. The completed code should look like this:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Host;
    
    namespace TakeABreakFunctionsApp
    {
        public static class Function1
        {
            [FunctionName("Function1")]
            public static async Task Run([TimerTrigger("0 */30 * * * *")]TimerInfo myTimer, TraceWriter log)
            {
                log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
                await SendMessageToSlack("You're working too hard. How about you take a break?", log);
            }
    
            private static async Task SendMessageToSlack(string message, TraceWriter log)
            {
                // Fetch environment variables (from local.settings.json when run locally)
                string channel = Env("ChannelToNotify");
                string slackbotUrl = Env("SlackbotUrl");
                string bearerToken = Env("SlackOAuthToken");
    
                // Prepare request and send via Http
                log.Info($"Sending to {channel}: {message}");
                string requestUrl = $"{slackbotUrl}?channel={Uri.EscapeDataString(channel)}&text={Uri.EscapeDataString(message)}";
                await SendHttpRequest(requestUrl, bearerToken);
            }
    
            private static async Task SendHttpRequest(string requestUrl, string bearerToken)
            {
                // Note: in a long-lived app you would reuse a single HttpClient
                // instance rather than creating one per request.
                using (HttpClient httpClient = new HttpClient())
                {
                    httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", bearerToken);
                    HttpResponseMessage response = await httpClient.GetAsync(requestUrl);
                    response.EnsureSuccessStatusCode(); // surface Slack API errors instead of silently ignoring them
                }
            }
    
            private static string Env(string name) => Environment.GetEnvironmentVariable(name, EnvironmentVariableTarget.Process);
        }
    }
  7. Edit local.settings.json.
    Add the following environment variables.
    • SlackbotUrl – The URL for the Slack API to post chat messages
    • SlackOAuthToken – An OAuth token that grants permission for your app to send messages to a Slack workspace.
      – See below for help generating a Slack OAuth token.
    • ChannelToNotify – The Slack channel to send messages to
  8. Your local.settings.json should look something like this:
    (Your SlackOAuthToken and ChannelToNotify variables will be specific to your Slack workspace.)

    {
      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
        "SlackbotUrl": "https://slack.com/api/chat.postMessage",
        "SlackOAuthToken": "[insert your generated token]",
        "ChannelToNotify": "[your channel id]"
      }
    }

Your Functions app is now ready to run! You just need to grab an authorization token for your Slack workspace.

Generate an OAuth token for your app to send messages to your Slack workspace

Before you can post a message to a Slack workspace, you must first tell Slack about the app and assign specific permissions for it to send messages as a bot. Once you’ve installed the app to the Slack workspace, you will be issued an OAuth token that you can send with your HTTP requests. For full details, you can follow the instructions here. Otherwise, follow the steps below.

  • Click here to register your new Functions app with your Slack workspace.
  • Provide a name (e.g. “Take a Break”) and select the appropriate Slack workspace, then press Create App.

    When the app is registered with Slack, the Slack API management page opens for the new app.

  • Select OAuth & Permissions from the navigation menu on the left.
  • In the OAuth & Permissions page, scroll down to Scopes, select the permission chat:write:bot, then select Save Changes.

  • After the scope permissions have been created and the page has refreshed, scroll to the top of the OAuth & Permissions page and select Install App to Workspace.

  • A confirm page opens. Review the details, then click Authorize.
  • Your OAuth Access Token is generated and presented at the top of the page.

  • Copy this token and add it to your local.settings.json as the value for SlackOAuthToken.

    Note: The OAuth access token is a secret and should not be made public. If you check this token into a public source control system like GitHub, Slack will find it and permanently disable it!

Run your Functions App on your local machine

Now that you’ve registered your app with Slack and have provided a valid OAuth token in your local.settings.json, you can run the Function locally.

Start the local Storage Emulator

You can configure your function to use a storage account on Azure. But if your app is configured to use development storage (which is the default for new Functions), then it will run against the local Azure Storage Emulator. Therefore, you’ll need to make sure the Storage Emulator is started before running your Functions app.

  • Open the Windows Start Menu and search for “Storage Emulator”.

Microsoft Azure Storage Emulator will launch. You can manage it via the icon in the Windows System Tray.

Start the Function app from Visual Studio
  • Press Ctrl+F5 to build and run the Functions app.
  • If prompted, update to the latest Functions tools.
  • A new command window launches and displays the log output from the Functions app.


After the configured interval elapses, the Timer trigger will fire and send a message to your Slack workspace.


You should see the message appear in the appropriate Slack channel.


Feel free to play around with the timer cron options in the Run method’s attribute to configure the function to execute at the intervals you’d like. Here are some example cron settings.
        Trigger cron format: {second} {minute} {hour} {day} {month} {day-of-week}
        (“0 */15 6-20 * * *”) = Every 15 minutes, between 06:00 AM and 08:59 PM
        (“0 0 0-5,21-23 * * *”) = Every hour from 12:00 AM to 06:00 AM and 09:00 PM to 12:00 AM
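For instance, applying the first schedule above to the earlier Function1 sample is just a matter of changing the attribute. A minimal sketch (the message text here is made up):

    // Fires at :00, :15, :30 and :45 of every hour from 06:00 through 20:59.
    [FunctionName("Function1")]
    public static async Task Run(
        [TimerTrigger("0 */15 6-20 * * *")] TimerInfo myTimer, TraceWriter log)
    {
        await SendMessageToSlack("Time to stretch!", log);
    }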

Congratulations! You’ve written a working Azure Functions App with a Timer trigger function.

What’s next?


Publish your Functions App to the cloud
So that your Functions app is always available and can be accessed globally (e.g., for HTTP trigger types), you can publish your app to the cloud. This article describes the process of publishing a Functions app to Azure.

Experiment with other Functions types
There’s an excellent collection of open-source samples available here. Poke around and see what takes your interest.

Tell us about your experience with Azure Functions
We’d love to hear about your experience with Azure Functions. If you’ve got a minute, please complete this short survey.
As always, feel free to leave comments and questions in the space below.

Happy holidays!

Justin Clareburt
Senior Program Manager
Visual Studio and .NET

Top stories from the VSTS community – 2017.12.22

Here are the top stories we found in our streams this week related to DevOps, VSTS, TFS, and other interesting topics.

TOP STORIES

How to setup a VSTS pipelines for Azure Service Fabric and its containing services – Clemens Reijnen
Sven wrote a detailed How To covering the creation of a release pipeline for a secure Azure... Read More

Because it’s Friday: Deck the Halls


Sure, this is a promo for a movie, but I'd love to have a full-length single of this:

Relatedly, if you want to settle an argument about which pop diva has the greatest vocal range, Giora Simchoni used R to perform frequency analysis of their hits:

(Chart: Whitney Houston's vocal range)

That's all from us here at the blog for this week, and in fact for a little while: we're taking a break for the holidays. We'll be back on January 2, and in the meantime, enjoy!

How to set up a 10″ Touchscreen LCD for Raspberry Pi


I'm a big fan of the SunFounder tech kits (https://www.sunfounder.com), and my kids and I have built several Raspberry Pi projects with their module/sensor kits. This holiday vacation we have two projects we're doing that coincidentally use SunFounder parts. The first is the Model Car Kit that uses a Raspberry Pi to control DC motors AND (love this part) a USB camera. So it's not just a "drive the car around" project; it can also include computer vision. My son wants to teach it to search the house for LEGO bricks and alert an adult so no one steps on them. We're thinking of having the car call out to Azure Cognitive Services, as their free tier has more than enough power for what we need.

For this afternoon, we are taking a 10.1" touchscreen display and adding it to a Raspberry Pi. I like this screen because it's got mounting holes on the back for any Raspberry Pi, a LattePanda, or a BeagleBone, and it works with pretty much anything that can output HDMI, so it can also be a small portable monitor/display for Android or iOS. It has 10-finger multitouch, which is fab. The instructions aren't linked from their product page, but I found them on their Wiki.

There are a lot of small LCDs you can get for a Pi project, from little 5" screens (for about $35) all the way up to this 10" one I'm using here. If you're going to mount your project on a wall or 3D print a box, a screen adds a lot. It's also a good way to teach kids about embedded systems. When my 10 year old saw the 5" screen and what it could do, he realized that the thermostat on the wall and/or the microwave ovens were embedded systems. Now he assumes every appliance is powered by a Raspberry Pi!

(Photos: the SunFounder controller board and a Raspberry Pi mounted to the 10.1" touchscreen; booting Windows 10 on a Raspberry Pi for no reason)

Take a look at the pic at the top right of this post. That's not a Raspberry Pi, that's the included controller board that interfaces with your tiny computer. It's included with the LCD package. That controller board also has an included power adapter that puts out 12V at 1500mA, which allows it to also power the Pi itself. That means you can power the whole thing with a single power adapter.

There's also an optional touchscreen "matchbox" keyboard package you can install to get an on-screen visual keyboard. However, when I'm initially setting up a Raspberry Pi, or I'm taking a few Pis on the road for demos and working in hotels, I throw this little $11 keyboard/mouse combo in my bag. It's great for quick initial setup of a Raspberry Pi that isn't yet on the network.


Once you've installed matchbox-keyboard you'll find it under MainMenu, Accessories, Keyboard. Works great!

* This post includes some referral links to Amazon.com. When you use these links, you not only support my blog, but you send a few cents/dollars my way that I use to pay for hosting and buy more gadgets like these! Thanks! Also, I have no relationship with SunFounder but I really like their stuff. Check out their site.


Sponsor: Scale your Python for big data & big science with Intel® Distribution for Python. Near-native code speed. Use with NumPy, SciPy & scikit-learn. Get it Today!



© 2017 Scott Hanselman. All rights reserved.
     

Merry Christmas and Happy New Year!


The Revolutions team is celebrating Christmas today, and we're taking a break with family and enjoying good food. And given the number of Eggnogs that are being prepared — thanks to Hadley Wickham's eggnogr Shiny app — it might be a good idea to take the rest of the week off as well. (You can find the R source behind the eggnogr app here.)


We'll be back again in the New Year, on January 2. Enjoy the holiday season, and a happy New Year to all.

Top stories from the VSTS community – 2017.12.29

Here are the top stories we found in our streams this week related to DevOps, VSTS, TFS, and other interesting topics.

TOP STORIES

My Top 5 Blog Posts of 2017 – Mike Douglas
2017 was an exciting year full of DevOps and Azure projects, learning, and sharing.

Planning, Scheduling and Executing – A Deeper Look [Part 1]... Read More

Do you have bad R habits? Here’s how to identify and fix them.


RStudio's Jenny Bryan (whose recent interviews here and here you should definitely check out) has some excellent advice for improving your workflow for R:

Use. Projects. — @JennyBryan at #EARLConf2017 #rstats pic.twitter.com/r4a08JhWHT

— David Smith (@revodavid) September 13, 2017

If you're routinely using setwd to manually change R's working directory, or using rm(list = ls()) to 'reset' your R session, that's a good sign that you're doing all of your R work in the default R workspace. It's a much better idea to set up a separate project (at its simplest, a separate directory) for each distinct piece of work. The default R interface makes this a bit tricky, but IDEs like RStudio and RTVS are designed to make this trivial, and you'll notice an immediate impact in the organization and reproducibility of your work.

And, you won't incur the Wrath of Jenny. Find her complete guide, chock-full of useful advice, at the link below.

Tidyverse: Project-oriented workflow

Last week in Azure: Azure HDInsight, Azure Cosmos DB, and more


Welcome to 2018! This edition of Last Week in Azure covers the final two weeks of 2017. Just before everyone disappeared for the holiday break here in Redmond, the news focused on Azure HDInsight.

Azure HDInsight

We lowered prices and raised capabilities for Azure HDInsight. Some capabilities of HDInsight, such as Apache Kafka and Azure Log Analytics integration, reached general availability, and the Enterprise Security Package was released in preview. You can also learn about how Xbox uses HDInsight to spelunk through a mountain of gaming data, and then check out how Fast SQL query processing in HDInsight performs at scale against industry-standard TPC-DS benchmarks.

You can get all of the details in the following Azure HDInsight posts:

Data

New connectors available in Azure Data Factory V2 - A list of over 25 new data connectors to enable copying data from additional data stores to Azure Data Factory, including Concur, PayPal, and Square.

Azure #CosmosDB: Entry point for unlimited containers is now 60% cheaper and other improvements - Some changes rolled out in December include: the entry point for unlimited collections/containers is now 60% cheaper, provisioning massive amounts of throughput is now completely frictionless, and Azure Cosmos DB is now available as a part of Microsoft Azure in France.

Join on-demand recasts of Azure Cosmos DB and Azure SQL Database webinars! - Check out two webinars now available for on-demand viewing. The first webinar looks at Azure Cosmos DB, a globally distributed, multi-model database service that enables scaled throughput and storage across many geographical regions. The second webinar shares how you can make the most of Azure SQL Database's machine learning features to deliver intelligent apps to your customers.

IaaS

VMware virtualization on Azure - Learn how we are enabling VMware virtualization in preview by working with multiple VMware Cloud Provider Program partners and running on existing VMware-certified hardware.

Benefits of migrating IaaS to Azure Resource Manager - Learn about the capabilities & features of running your IaaS resources on Azure Resource Manager. If you're running your IaaS resources using the predecessor classic stack in Azure, check out the no-downtime migration service that's available to move your IaaS resources to Azure Resource Manager.

Azure Backup now supports BEK encrypted Azure virtual machines - Get a simplified experience and enhanced security to create any kind of encrypted virtual machine and then back it up using Azure Backup. Back up virtual machines encrypted using BEK-only as well as BEK and KEK, for both managed and unmanaged disks.

Other Headlines

Bring your own vocabulary to Microsoft Video Indexer - Adaptation for speech recognition is necessary to teach the automatic speech recognition (ASR) system new words and how you use them. The adaptation technology in Video Indexer takes nothing but adaptation text and modifies the language model in the ASR system to make more intelligent guesses for a given domain. This is useful to teach your system acronyms, new words, and new uses of known words.

GDPR How-to: Get organized and implement the right processes - Microsoft France recently published a detailed implementation guide that's available in both English and French. The guide provides a methodology for creating and executing your own GDPR compliance program.

Announcing the preview release of subscription level budgets via ARM APIs - Budgets at the subscription level are only the first step in a number of releases planned to improve the cost management experience. While we work on an in-portal experience, learn how you can use a preview of the Budgets API to set up a budget for a subscription, and also set up multiple notification thresholds.

Reference Architecture and automation for Financial Services web applications - Download the new Financial Services Regulated Workloads Blueprint from the Azure Security and Compliance Blueprint Program for an automated solution that will help guide you in storing and managing sensitive financial information such as payment data in Azure.

How Azure Security Center detects vulnerabilities using administrative tools - In December, Azure Security Center released a new analytic to detect suspicious account creation operations that might be used as backdoor accounts. This analytic includes several criteria to distinguish between benign administrative account creation and suspicious activities for triggering an alert.

New Alerts (preview) in Azure Monitor - Improve your monitoring experience with Azure Monitoring Services, which includes the ability to set up alerts for monitoring the metrics and log data for the entire stack across your infrastructure, application, and platform.

Announcing the public preview for Adaptive Application Controls - Adaptive Application Controls leverages machine learning to analyze the behavior of your Azure virtual machines, create a baseline of applications, group the virtual machines, and recommend and automatically apply the appropriate whitelisting rules. See how you can view, modify, and receive alerts for these rules in Azure Security Center.

4 tips for keeping your resolution to learn Azure - If you're a developer, set yourself up for success with my 4 tips on how to learn Azure in the new year.

Service updates

Azure shows

Apache Kafka on HDInsight - Raghav Mohan joins Scott Hanselman to talk about Apache Kafka on HDInsight, which added the open-source distributed streaming platform last year to complete a scalable, big data streaming scenario on Azure. The service has come a long way since then, processing millions of events per second and petabytes of data per day to power scenarios like Toyota's connected car, Office 365's clickstream analytics, and fraud detection for large banks. Deploy managed, cost-effective Kafka clusters on Azure HDInsight with a 99.9% SLA with just 4 clicks or pre-created ARM templates.

Use the Azure portal to answer your billing questions - Tommy Nguyen joins Scott Hanselman to discuss Billing in the Azure portal. If you're wondering: "Where do I find a copy of my invoice?" or "How does an Azure service affect my overall costs?" - these questions and more will be answered in this episode, which highlights features in the Azure portal to get cost and billing clarity.

The Azure Podcast: Episode 209 - DevOps with Kubernetes - As part of our Partner Spotlight series, we have Dan Garfield from CodeFresh.io and Jessica Deen, a Cloud Developer Advocate at Microsoft, talking about DevOps in a Kubernetes world. Good timing too, considering we recently did a show on Kubernetes!

Azure #CosmosDB: Recap of 2017



It was a pretty amazing year for Azure Cosmos DB, highlighted by the launch of the service; the preview and general availability of the Graph and Table APIs, the MongoDB API, and the native Spark connector; and many other awesome capabilities. There were breakthroughs in making our SLAs truly industry-leading, adding native integration with Azure Functions, and launching Try Cosmos DB for free to empower anyone to experience and play with our service without having an Azure account or having to specify a credit card. Below is a look back at 2017 and some of the memorable milestones.

The Launch of Azure Cosmos DB

In May, we were super excited to announce the general availability of Azure Cosmos DB – Microsoft’s globally distributed, massively scalable, multi-model database service. It is the first globally-distributed data service that lets you elastically scale throughput and storage across any number of geographical regions while guaranteeing low latency, high availability, and consistency – all backed by the most comprehensive SLAs in the industry. It is the first cloud database to natively support a multitude of data models and popular query APIs. It is built on a novel database engine capable of ingesting sustained volumes of data and providing blazing fast queries – all without having to deal with schema or index management. And it is the first cloud database to offer five well-defined consistency models so you can choose just the right one for your app. Read more about Azure Cosmos DB: The industry’s first globally-distributed, multi-model database service, and relive the launch by watching the #MSBuild Day 1 Keynote video.

Technical rigor behind Azure Cosmos DB

The work of Dr. Leslie Lamport, Turing Award winner and world-renowned computer scientist, has profoundly influenced many large-scale distributed systems. Azure Cosmos DB is no exception. Over the course of the seven years building Azure Cosmos DB, Leslie’s work has been a constant source of inspiration and a solid foundation for our platform. Since its inception, Azure Cosmos DB has offered developers a choice between five well-defined consistency models along the consistency spectrum – strong, bounded staleness, session, consistent prefix, and eventual. You can configure the default consistency level on your Azure Cosmos DB account and later override the consistency on a specific read request. You can read all about the consistency levels in Azure Cosmos DB in the article Tunable data consistency levels in Azure Cosmos DB. To create these five consistency levels and build many of the capabilities within Azure Cosmos DB, we married decades’ worth of distributed systems and database research with world-class engineering rigor. In this interview, Leslie shares his thoughts on the foundations of Azure Cosmos DB, his influence on its design, and in particular on the modeling of consistency choices.
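To make the consistency choice concrete, here is a minimal sketch using the .NET DocumentDB SDK of that era; the account URI, key, and database/collection/document names are placeholders, not real values:

    using System;
    using System.Threading.Tasks;
    using Microsoft.Azure.Documents;
    using Microsoft.Azure.Documents.Client;

    public static class ConsistencyDemo
    {
        public static async Task ReadWithRelaxedConsistency()
        {
            // Session is the default consistency for this client (hypothetical account).
            var client = new DocumentClient(
                new Uri("https://my-account.documents.azure.com:443/"),
                "<auth-key>",
                connectionPolicy: null,
                desiredConsistencyLevel: ConsistencyLevel.Session);

            // An individual read may override the default with a *weaker* level,
            // e.g. Eventual; it can never request a stronger one.
            var response = await client.ReadDocumentAsync(
                UriFactory.CreateDocumentUri("mydb", "mycoll", "doc1"),
                new RequestOptions { ConsistencyLevel = ConsistencyLevel.Eventual });
        }
    }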

Native Spark Connector for Azure Cosmos DB

In June, we announced the preview of the native Spark connector for Azure Cosmos DB that provides seamless interaction with globally-distributed, multi-model data and empowers customers to easily run analytics, machine learning, and data science on top of globally-distributed operational data. The Spark connector for Azure Cosmos DB enables real-time data science, machine learning, advanced analytics, and exploration over globally distributed data in Azure Cosmos DB by connecting it to Apache Spark. The connector efficiently exploits the native Azure Cosmos DB managed indexes and enables updateable columns when performing analytics. It also utilizes push-down predicate filtering against fast-changing globally-distributed data, addressing a diverse set of IoT, data science, and analytics scenarios. Our goal is to help developers write globally distributed apps more easily using the tools and APIs they are already familiar with. Azure Cosmos DB’s database engine natively supports the SQL API, MongoDB API, Gremlin (graph) API, and Azure Table API. With the updated Spark connector for Azure Cosmos DB, Apache Spark can now interact with all Azure Cosmos DB data models: documents, tables, and graphs. Read more about the Spark connector for #CosmosDB and other Azure #CosmosDB announcements.

Azure Cosmos DB Change Feed

Azure Cosmos DB is extremely well-suited for IoT, gaming, retail, and operational logging applications. A common design pattern in these applications is to use changes to the data to trigger additional actions. To help you build powerful applications on top of Azure Cosmos DB, we built change feed support, which provides a sorted list of documents within a collection in the order in which they were modified. To address scalability while preserving simplicity of use, we introduced the Azure Cosmos DB Change Feed Processor Library. Read more to learn when and how you should use the Change Feed Processor Library.
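As an illustration of what the change feed exposes, here is a rough sketch that reads it directly with the .NET SDK; the Change Feed Processor Library hides this per-partition bookkeeping for you, and the database and collection names below are hypothetical:

    using System;
    using System.Threading.Tasks;
    using Microsoft.Azure.Documents;
    using Microsoft.Azure.Documents.Client;
    using Microsoft.Azure.Documents.Linq;

    public static class ChangeFeedDemo
    {
        public static async Task ReadChangesAsync(DocumentClient client)
        {
            Uri collectionUri = UriFactory.CreateDocumentCollectionUri("mydb", "mycoll");
            IDocumentQuery<Document> query = client.CreateDocumentChangeFeedQuery(
                collectionUri,
                new ChangeFeedOptions
                {
                    PartitionKeyRangeId = "0",  // in practice, enumerate all ranges
                    StartFromBeginning = true
                });

            while (query.HasMoreResults)
            {
                // Documents arrive in the order in which they were modified.
                foreach (Document changed in await query.ExecuteNextAsync<Document>())
                {
                    Console.WriteLine($"Changed document: {changed.Id}");
                }
            }
        }
    }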

Azure Cosmos DB – database for the serverless era

We are now entering the era of serverless. With serverless architecture, there is a whole new range of applications emerging, in particular serverless event-driven apps. When the event is triggered, your application pops into action, does what it needs to do, and disappears again. That’s why we call it “serverless”: in essence, your server does not exist until it is needed. In September, we announced the availability of native integration between Azure Cosmos DB and Azure Functions. With this native integration, you can create database triggers, input bindings, and output bindings directly from your Azure Cosmos DB account. Using Azure Functions and Azure Cosmos DB, you can create and deploy event-driven serverless apps with low-latency access to rich data for a global user base. Watch this demo on Serverless Apps with Azure Cosmos DB & Azure Functions at Microsoft Ignite.
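As a sketch of what that native integration looks like in a v1 Functions app, a Cosmos DB trigger can be declared with an attribute; the database, collection, and connection-setting names below are hypothetical:

    using System.Collections.Generic;
    using Microsoft.Azure.Documents;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Host;

    public static class ChangeNotifier
    {
        // Runs whenever documents in the monitored collection change.
        [FunctionName("ChangeNotifier")]
        public static void Run(
            [CosmosDBTrigger(
                databaseName: "mydb",
                collectionName: "mycoll",
                ConnectionStringSetting = "CosmosDBConnection",
                LeaseCollectionName = "leases")]
            IReadOnlyList<Document> changedDocs,
            TraceWriter log)
        {
            log.Info($"{changedDocs.Count} document(s) changed.");
        }
    }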

Try Azure Cosmos DB for free

This year we launched Try Azure Cosmos DB for Free, an experience that allows anyone to play and build with Azure Cosmos DB without signing up for Azure or requiring a credit card, all absolutely free for a limited time. Why did we launch Try Cosmos DB for free? It’s simple. We want to make it easy for developers to evaluate Azure Cosmos DB, build and test their app against Azure Cosmos DB, do a hands-on lab, a tutorial, create a demo or perform unit testing without incurring any costs. Our goal is to enable any developer to easily experience Azure Cosmos DB and what it has to offer, become more comfortable with our database service and build the expertise with our stack at zero cost. With Try Cosmos DB for free, you can go from nothing to a fully running planet-scale Azure Cosmos DB app in less than a minute. Read more on how to Try Azure #CosmosDB for free.

12 months of free access to Azure Cosmos DB

In September, Azure announced the Azure free account, which gives you $200 in Azure credits for the first 30 days and a limited quantity of free services for 12 months. For more information, see Azure free account. Azure Cosmos DB is one of the popular products that is part of the Azure free account and is accessible to you for 12 months for free. To start, create your Azure free account today.

20 days of Azure Cosmos DB Tips Series

In September, we ran a series called The 20 days of Azure Cosmos DB tips by Simona Cotin on Twitter to share our tips on using Azure Cosmos DB. You may wish they were all available in one place. Well, they are, in 20 days of Azure Cosmos DB Tips!

Azure Cosmos DB metrics, Azure Monitor, OMS, and more

Transparency is an important virtue of any cloud service. Azure Cosmos DB is the only cloud service that offers 99.990% SLAs on read availability, throughput, consistency, and <10ms latency, and we transparently show you metrics on how we perform against this promise. This year, we have made a number of investments to help developers monitor and troubleshoot their Azure Cosmos DB workloads. In September, we announced several metrics we had added to the service, as well as integration of our metrics with popular Azure Monitor, OMS, and a preview of Diagnostics logs. Get insights into your Azure #CosmosDB: partition heatmaps, OMS, and more.

Azure Cosmos DB Cassandra API Preview

At Connect, we launched the preview of native support for the Apache Cassandra API – offering you Cassandra-as-a-service powered by Azure Cosmos DB. You can now experience the power of the Azure Cosmos DB platform as a managed service with the familiarity of your favorite Cassandra SDKs and tools – without any app code changes. Sign up today and access the Azure Cosmos DB Cassandra API to easily build planet-scale Cassandra apps.


99.999% read availability at global scale

We continuously improve our stack, and in November we were really excited to announce even stronger SLAs for Azure Cosmos DB: databases spanning multiple regions now have 99.999 percent read availability. Learn more by reading the Azure Cosmos DB SLA page.

General availability of Azure Cosmos DB Table API

Azure Cosmos DB Table API became generally available this year. With the Azure Cosmos DB Table API, your applications written for Azure Table storage can now leverage premium capabilities of Azure Cosmos DB, such as turnkey global distribution, low latency reads/writes, automatic secondary indexing, dedicated throughput, and much more. Read more about the general availability of the Azure #CosmosDB Table API.

General availability of Azure Cosmos DB Gremlin (Graph) API

Azure Cosmos DB Gremlin (Graph) API has also just become generally available. With general availability, we’ve delivered critical improvements to the performance of graph operations, improved import and backup scenarios through new tooling, and enhanced support for open-source frameworks recommended by Apache TinkerPop, including Python client support. With GA, we will also simplify migration from popular database engines like TitanDB, Neo4j, and others. Read more about the general availability of the Azure #CosmosDB Gremlin (Graph) API.

General availability of MongoDB API and extended capabilities

This year, the MongoDB API became generally available, and in November we announced the public preview of unique index and aggregation pipeline support, which allows Azure Cosmos DB developers using the MongoDB API to perform data manipulation in multi-stage pipelines within a single query, streamlining the development of more sophisticated aggregations. The unique index capability is now generally available and allows you to introduce uniqueness constraints on any document fields that are already auto-indexed in Azure Cosmos DB. Azure Cosmos DB now also implements the MongoDB 3.4 wire protocol, allowing the use of tools and applications that rely on it. Learn more about Azure #CosmosDB support for MongoDB.

Azure Cosmos DB in Azure Storage Explorer

We announced the public preview of support for Azure Cosmos DB in the Azure Storage Explorer (ASE). With this release, Azure Cosmos DB databases can be explored and managed with the same consistent user experiences that make ASE a powerful developer tool for managing Azure storage. The extension allows you to manage Azure Cosmos DB entities, manipulate data, and create and update stored procedures, triggers, as well as User Defined Functions. Azure Storage Explorer not only offers unified developer experiences for inserting, querying, and managing your Azure Cosmos DB data, but also provides an editor with syntax highlighting and suggestions for authoring your Cosmos DB stored procedures. With this extension you are now able to browse Cosmos DB resources across both SQL (DocumentDB) and MongoDB interfaces alongside existing experiences for Azure Blobs, tables, files, and queues in ASE.

Entry point for unlimited collections/containers is now 60% cheaper

Last February, we lowered the entry point for unlimited containers, making them 75% cheaper. We continue making improvements to our service, and in December we announced that unlimited containers now have an entry point that is 60% cheaper than before. Instead of provisioning 2,500 RU/sec as a minimum, you can now provision an unlimited collection at 1,000 RU/sec and scale in increments of 100 RU/sec. Unlimited containers (collections) enable you to dynamically scale your provisioning from as low as 1,000 RU/sec to millions of RU/sec with no limit on storage consumption.
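For example, here is a minimal sketch of creating an unlimited (partitioned) collection at the new 1,000 RU/sec entry point with the .NET SDK; the database, collection, and partition key names are hypothetical:

    using System.Threading.Tasks;
    using Microsoft.Azure.Documents;
    using Microsoft.Azure.Documents.Client;

    public static class ProvisioningDemo
    {
        public static async Task CreateUnlimitedCollectionAsync(DocumentClient client)
        {
            // Unlimited collections require a partition key.
            var collection = new DocumentCollection { Id = "telemetry" };
            collection.PartitionKey.Paths.Add("/deviceId");

            await client.CreateDocumentCollectionAsync(
                UriFactory.CreateDatabaseUri("mydb"),
                collection,
                new RequestOptions { OfferThroughput = 1000 }); // previously 2,500 minimum
        }
    }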

Azure Cosmos DB is in Azure France now

Azure Cosmos DB is a foundational service in Azure powering mission-critical applications, services and customer workloads around the world. Azure Cosmos DB is now a part of the preview of Microsoft Azure in France, which is now open to all customers, partners and ISVs worldwide giving them the opportunity to deploy services and test workloads against Azure Cosmos DB in these latest Azure regions. Azure Cosmos DB in the new regions will offer the same enterprise-grade reliability and performance with the industry-leading comprehensive SLAs to support the mission-critical applications and workloads of businesses and organizations in France. Sign up for the Azure France preview.

2018 is going to be truly Cosmic!


Most of the above happened thanks to you – our customers, MVPs, developers, and partners – and organically, as we worked hard to add value for our customers and empower anyone to build globally-distributed applications with ease. It took us years to operationalize the capabilities we currently offer. We were fortunate to have internet-scale Microsoft applications to learn from, which allowed us to identify, validate, and operationalize the five consistency models, the industry-leading comprehensive SLAs, the guaranteed single-digit-millisecond latency for reads and writes, and robust partition management for independent scale-out of storage and throughput. We have been constantly working on optimizing propagation delays across various workloads via dynamic replication topologies, and hardening our software stack against a variety of failure modes and interconnectivity issues both within and across Azure regions. We have constantly optimized the core replication protocol (we are not done yet) to bring all of this to you, our customers. Azure Cosmos DB is the database of the future – it is what we believe is the next big thing in the world of massively scalable databases! It makes your data available close to where your users are, worldwide, and makes building planet-scale apps super easy using the API and data model of your choice.

A new year is on the way and the possibilities are endless. We can’t wait to share what we have in store for you in 2018, and to see what you will build with Azure Cosmos DB this upcoming year.

Happy New Year!


- Your friends at Azure Cosmos DB (#CosmosDB, @AzureCosmosDB)

Migration checklist when moving to Azure App Service


I have been continuously getting requests from customers, colleagues, and partners about what to consider when migrating applications to Azure PaaS services, and more specifically to App Service.

This post tries to cover the majority of those cases and aims to provide a checklist and ready reckoner for customers/partners intending to migrate their existing applications to Azure App Service.

To start, let’s look at various constraints to weigh before migrating your applications to Azure App Service:

  • Port Bindings - Azure App Service supports port 80 for HTTP and port 443 for HTTPS traffic. If you have sites using any other port, remember that after migration these are the only ports that will be used.
  • Usage of assemblies in the GAC (Global Assembly Cache) - This is not supported. Consider bin-placing the assemblies in the local bin folder.
  • IIS5 Compatibility Mode - IIS5 Compatibility Mode is not supported. In Azure App Service, each Web App and all the applications under it run in the same worker process with a specific set of application pool settings.
  • IIS7+ Schema Compliance - One or more elements and/or attributes are being used that are not defined in the Azure App Service IIS schema. Consider using XDT transforms.
  • Single Application Pool Per Site - In Azure App Service, each Web App and all the applications under it run in the same application pool. In case you have applications with different application pools in IIS, consider establishing a single application pool with common settings or creating a separate Web App for each application.
  • COM and COM+ components - Azure App Service does not allow the registration of COM components on the platform. If your site(s) or application(s) make use of any COM components, these would need to be rewritten in managed code and deployed with the site or application.
  • ISAPI Extensions - Azure App Service can support the use of ISAPI extensions; however, the DLL(s) need to be deployed with your site and registered via the web.config.

Once the above limitations have been taken into consideration, you will need to migrate your applications. The easiest path is through the Azure App Service Migration Assistant. The Azure App Service Migration site and the tool can be utilized to migrate sites from Windows and Linux web servers to Azure App Service. As part of the migration, the tool will create Web Apps and databases on Azure, publish your content, and publish your database.

This tool is available for both Windows server and Linux servers. The migration tool for Windows Server works either from the local machine or from a remote machine. It allows you to migrate sites from IIS running on Windows Server 2003 onwards.

Please refer to Windows Site Migration Tool for details.

The Linux site migration tool allows you to migrate sites from Linux web servers running Apache to the cloud. Only Apache is supported at this point in time.

Please refer to Linux Site Migration Tool for details.

Once you have decided to migrate, the following areas need to be considered for migrating applications to Azure App Service.

  • On-premises integration - In case your applications communicate with other applications that will not be migrated to Azure, you have to consider how that communication will happen once your application moves to the cloud. One solution is to enable the other application to communicate over the internet using REST. This may require changes in both applications, not to mention the additional risk of exposing the server to the internet. Another approach is to establish secure connectivity to your on-premises server from Azure App Service, where your application is hosted. This can be done in any of the following ways, depending on your requirements: deploying your apps in an App Service Environment using an Isolated App Service plan; enabling VNet integration with an Azure VNet, establishing a site-to-site VPN between that VNet and on-premises, and then enabling routes between your App Service and the on-premises VM; or establishing Hybrid Connections. A detailed comparison of all the approaches would warrant a separate blog post.
  • Authentication – When on-premises, you could be okay with no authentication or Windows authentication, as there was mutual trust with AD. When you migrate to Azure, you will need to enable authentication with Azure Active Directory. This means modifying some of your configuration so you can authenticate your users via Azure AD. Complete details are here on our documentation site.
  • Session State – Ideally, make your application stateless so you can scale and switch at will. Where that is not possible, configure your session state to be persisted in Azure Redis Cache.
  • File Persistence – Websites often need to persist uploaded files. On Azure App Service, it is recommended to persist any files outside of the App Service itself, for example in a blob store. Modify the application to use either the Azure Storage SDK or the REST APIs for saving and accessing files (see the sketch at the end of this checklist).
  • App Settings/Connection Strings – There will be app settings and connection strings that change based on environment, and some that stay the same. For the ones that change based on environment, also define them on the portal/template so that they can be overridden for different deployment slots.
  • Logging – If your logging framework logs to files saved locally, you will need to update it to either log to Azure Diagnostics or to a centralized blob store. You can also include Azure Application Insights to get deeper insights into how your application is performing.
  • Certificates – Certificates are not migrated directly. You will need to explicitly upload your certificates for them to work on Azure. Details are in the Bind SSL Certificate documentation. You can also purchase certificates directly from Azure; details are in the buy SSL cert documentation.
  • Custom Domains – Custom domains can be associated with Azure Web Apps via a CNAME record change. You also need to update App Service to validate the DNS. Details are in the map custom domain documentation.
  • Email – Sending email requires an SMTP server. App Service does not provide one, and there is no way to configure one within App Service. While you can set up an SMTP server to send email on Azure IaaS VMs, we do put in restrictions. We recommend using relay services to send email, e.g., Office 365.
  • LDAP Queries – In case you are building internal applications that query your LDAP store, such as AD, those may not work on Azure App Service. Specifically, in the case of Active Directory, you can move AD to Azure AD and then use the Graph APIs to make the necessary queries. For this, you will need to register your application with Azure AD to permit querying directory objects. The complete list of Graph APIs is here.
  • Shared/Linked Database queries - An application that has multiple databases needs additional thought. If there are cross-database queries, you will need to move the databases to Elastic Pools and then use Elastic Queries to query across them. If your SQL statements reference a linked database, this will not be permitted on Azure and may require re-architecting the solution to avoid linked database queries.
  • SQL Server features - There are a number of server-level features that are not supported on Azure SQL Database. There is a limited preview of SQL Managed Instance, but until then you may have to consider the following:
    • SQL Agents – We see a number of databases with SQL Agent jobs that perform scheduled activities. Some of this functionality can be implemented via Elastic Jobs. For the rest, you would need to set up Web Jobs and/or Azure Functions.
    • SSIS – You can either deploy SSIS on VMs or use Azure Data Factory to load/migrate data.
    • SSRS – You can deploy SSRS on VMs, use customized reporting in your application, or use Power BI to generate intuitive reports.
    • SSAS – SSAS is available as a separate service called Azure Analysis Services.

Do note that all of the above solutions will require connecting to the Azure SQL Database over public DNS. You can ensure that the necessary sources are explicitly allowed connections via SQL Firewall.

  • SQL Firewall – This is often overlooked, but you should explicitly allow the IPs that will connect to the Azure SQL Database. This includes client IPs that will manage the database using SSMS, as well as Azure App Service.
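To illustrate the blob-store approach called out in the File Persistence item above, here is a minimal sketch using the classic Azure Storage .NET SDK (the WindowsAzure.Storage package); the connection string, container name, and method name are placeholders:

    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    public static class FileStore
    {
        // Persists an uploaded file to blob storage instead of the local file system.
        public static async Task SaveUploadAsync(Stream uploadedFile, string fileName)
        {
            CloudStorageAccount account = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
            CloudBlobClient blobClient = account.CreateCloudBlobClient();

            CloudBlobContainer container = blobClient.GetContainerReference("uploads");
            await container.CreateIfNotExistsAsync();

            CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
            await blob.UploadFromStreamAsync(uploadedFile);
        }
    }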

Azure Analysis Services features on Azure Friday


Built on the proven analytics engine in Microsoft SQL Server Analysis Services, Azure Analysis Services delivers enterprise-grade BI semantic modeling capabilities with the scale, flexibility, and management benefits of the cloud. The success of any modern data-driven organization requires that information is available at the fingertips of every business user, not just IT professionals and data scientists, to guide their day-to-day decisions. Azure Analysis Services helps you transform complex data into actionable insights. Users in your organization can then connect to your data models using tools like Excel, Power BI, and many others to create reports and perform ad-hoc interactive analysis.

I joined Scott on Azure Friday to talk about some new features in Azure Analysis Services. Query scale out and diagnostic logging were announced at the SQL PASS Summit 2017 and both lend themselves particularly well to the cloud.

  • Query scale out for Azure Analysis Services allows client queries to be distributed among multiple query replicas in a query pool, reducing response times during high query workloads. You can also separate processing from the query pool, ensuring client queries are not adversely affected by processing operations. If you have ever set up scale out on premises, you might be shocked to know the implementation is basically a simple slider bar in Azure!
  • Diagnostic logging is a key feature of IT-owned BI implementations, allowing various insights for auditing, monitoring of server health, usage statistics, and identification of long-running user queries. It is also much quicker and easier to set up than on premises, leveraging the Azure Diagnostic Logs platform.

 

Learn more about Azure Analysis Services.

Azure Data Lake tools integrates with VSCode Data Lake Explorer and Azure Account


If you are a data scientist who wants to explore your data and understand what is being saved and how folders are organized, or a developer looking for easier navigation inside ADLS, try the Data Lake Explorer in the VSCode ADL Tools. The VSCode Data Lake Explorer enhances your Azure login experience, empowers you to manage your ADLA metadata in a tree-like hierarchical way, and enables easier file exploration for ADLS resources under your Azure subscriptions. You can also preview, delete, download, and upload files through the context menu. With the integration of the VSCode explorer, you can choose your preferred way to manage your U-SQL databases and your ADLS storage accounts in addition to the existing ADLA and ADLS commands.

If you have difficulty signing in to Azure and want a simpler sign-in process, the Azure Data Lake Tools integration with the VSCode Azure Account extension enables automatic sign-in and greatly enhances the Azure integration experience. If you are an Azure multi-tenant user, the integration with Azure Account unblocks you and empowers you to navigate your Azure subscription resources across tenants.

If your source code is in GitHub, a new command, ADL: Set Git Ignore, has been added to auto-exclude system-generated files and folders from your GitHub source repository.

Key Customer Benefits

  • Support Azure auto sign-in and improved sign-in experiences via integration with the Azure Account extension.
  • Enable multi-tenant support, allowing you to manage your Azure subscription resources across tenants.
  • Browse ADLA metadata and view metadata schemas while performing U-SQL authoring.
  • Create and delete your U-SQL database objects anytime in a tree-like explorer.
  • Navigate across ADLS storage accounts for file exploration, file preview, file download, file/folder delete, and file/folder upload in a tree-like explorer.
  • Exclude system-generated files and folders from the GitHub repository through a command.

Summary of new features

  • Azure Data Lake Analytics integration with Data Lake Explorer


  • Azure Data Lake Storage integration with Data Lake Explorer 


  • Set Git Ignore file



How to install or update

Install Visual Studio Code and download Mono 4.2.x (for Linux and Mac). Then get the latest Azure Data Lake Tools by going to the VSCode Extension repository or the VSCode Marketplace and searching for Azure Data Lake Tools.


For more information about Azure Data Lake Tool for VSCode, please use the following resources:

Learn more about today’s announcements on the Azure blog and the Big Data blog. Discover more on the Azure service updates page.

If you have questions, feedback, comments, or bug reports, please use the comments below or send a note to hdivstool@microsoft.com.

Build richer apps with your time series data


Today, we are pleased to announce the release of new Time Series Insights (TSI) developer tools, including an Azure Resource Manager (ARM) template, API code samples, and easy-to-follow documentation for developers. These developer tools will shorten the time it takes to get started developing. Using them, customers can more easily embed TSI’s platform into custom applications to power charts and graphs, compare data from different points in time, and dynamically explore trends and correlations in their data.

As organizations transition their go-to-market and business models from selling devices to selling services, they are developing companion applications that provide operational insights and analytics to their customers. Much of the data required to power these applications is time series, but large volumes of time series data can be very challenging to store and query. Time Series Insights (TSI) takes the burden of time series data management away from these organizations, and TSI’s platform capabilities enable developers to build applications that provide valuable insights to their customers.

Why time series data is difficult to embed in applications today

Time series data at IoT scale can lead to high latency and long rendering times when queried from traditional databases. Many customers have told us that it’s easy to hang a client application when you need to search across billions of time series events, and they’re right: delivering real-time experiences on top of time series data is tough. Moreover, many of our customers don’t want to hire an army of engineers to manage a backend; they would rather invest those resources in building their core product.

Why embed Time Series Insights into your application

TSI is purpose-built for time series data and is optimized for reads and throughput so that you can query your data on the fly. Applications built on top of TSI can dynamically visualize and query data from all over the world in real time by taking advantage of built-in capabilities like aggregation and tumbling-window interval processing. TSI is an SLA-backed, fully managed service, so there’s no need to hire a team of engineers to manage your data. It also scales with you: as your application grows in usage, so do your database and query engine.
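As a concrete illustration, below is a minimal sketch of a tumbling-window query against TSI’s aggregates API in Python. The payload shape follows the GA query API as we understand it, but the environment FQDN, the temperature property, and the bearer-token acquisition are all placeholders, so treat this as a starting point rather than a finished client.

```python
# Minimal sketch: hourly averages from a TSI environment via the aggregates API.
# Assumptions: ENV_FQDN and TOKEN are placeholders, and the events carry a
# numeric "temperature" property; adjust to match your own environment.
import requests

ENV_FQDN = "<environment-id>.env.timeseries.azure.com"
TOKEN = "<AAD bearer token for https://api.timeseries.azure.com/>"

query = {
    # Time range to aggregate over
    "searchSpan": {
        "from": {"dateTime": "2018-01-01T00:00:00.000Z"},
        "to": {"dateTime": "2018-01-02T00:00:00.000Z"},
    },
    "aggregates": [{
        # Tumbling window: one bucket per hour over the event timestamp
        "dimension": {"dateHistogram": {
            "input": {"builtInProperty": "$ts"},
            "breaks": {"size": "1h"},
        }},
        # Average of a numeric property within each bucket
        "measures": [{"avg": {"input": {"property": "temperature", "type": "Double"}}}],
    }],
}

resp = requests.post(
    f"https://{ENV_FQDN}/aggregates",
    params={"api-version": "2016-12-12"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=query,
)
resp.raise_for_status()
print(resp.json())  # one averaged value per hourly bucket, ready for charting
```

Each bucket in the response maps directly to a point on a chart, which is why this pattern works well for powering application dashboards.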

How Time Series Insights makes it easier for developers

TSI APIs enable fine-grained control over time series data, and they make it easy to plug TSI into a broader workflow or technology stack. The ARM template helps manage TSI environments where using the Azure portal doesn’t make sense, such as from a custom application built on top of TSI. Similarly, the C# and JavaScript clients we are releasing today give developers examples to follow so that they can more easily build advanced time series functionality into their applications. These tools dramatically simplify the development process by providing a framework both for programmatically managing Azure resources and for automating queries of your data.

We previewed these developer tools with several organizations that expressed interest in using Time Series Insights as a platform. One of them, Steelcase, just released the Steelcase Workplace Advisor, built with TSI’s developer tools. The Steelcase Workplace Advisor uses time series data to help customers measure and improve the effectiveness of the workplace in near real time.

[Image: the Steelcase Workplace Advisor application]

“We are constantly working to help our customers reimagine how they can empower their workforce to work more efficiently in their workplaces. To do this well, we needed a place to capture and store large volumes of time series data, make calculations with that data on the fly in real-time, and aggregate that data, so it’s easier to view and explore in our application. Time Series Insights APIs have enabled us to provide real-time visibility across workspaces around the globe, giving our customers the ability to intuitively gain insights and make informed decisions on how to optimize the workplace.  We chose to build on TSI because of the speed it allows us to dynamically query our data, its interval chunking, and aggregation capabilities.”

-    Scott King, Software Developer at Steelcase

What we’re releasing today

To help developers build richer time series data experiences, we’re releasing:

  • An Azure Resource Manager (ARM) template that automates the management of environments, access policies, event sources, and reference data. An ARM template is a JSON file that defines the infrastructure and configuration of resources in a resource group. To perform operations on Azure TSI resources, you send HTTPS requests containing supported parameters along with the ARM template; you can also pass these Azure resource parameters from the PowerShell command line (see the deployment sketch after this list).
  • C# and JavaScript samples that show a framework for communicating with the Time Series Insights service. Following these examples enables developers to write applications that call TSI APIs. You can find these samples on our GitHub samples page.
  • Documentation to go along with both, helping developers understand what they need to do to get started, the process of using ARM and our APIs, and frequently asked questions.
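To make the HTTPS path concrete, here is a hedged Python sketch that deploys a downloaded TSI ARM template through the Azure Resource Manager REST API. The subscription, resource group, file name, and token are placeholders, and the empty parameters block would need to be filled in to match your template; PowerShell users can achieve the same with the New-AzureRmResourceGroupDeployment cmdlet.

```python
# Minimal sketch: deploy an exported TSI ARM template with a raw ARM REST call.
# Assumptions: placeholder subscription/resource-group/token values and a
# template file named "tsi-template.json" exported from the Azure portal.
import json
import requests

SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
TOKEN = "<AAD bearer token for https://management.azure.com/>"

with open("tsi-template.json") as f:
    template = json.load(f)

url = (
    "https://management.azure.com"
    f"/subscriptions/{SUBSCRIPTION}/resourcegroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.Resources/deployments/tsi-deployment"
)
body = {
    "properties": {
        "mode": "Incremental",  # leave resources not in the template untouched
        "template": template,
        "parameters": {},       # supply values for the template's parameters here
    }
}

resp = requests.put(
    url,
    params={"api-version": "2017-05-10"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=body,
)
resp.raise_for_status()
print(resp.json()["properties"]["provisioningState"])
```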

Getting started

You can access the new ARM template in the Azure portal under the automation script blade or on the management documentation page. You can find related documentation here.

Below is a screenshot of the automation script blade in a Time Series Insights environment within the Azure portal. You can now easily download your ARM template to deploy it programmatically or add it to your library for later use.

[Screenshot: the automation script blade in a Time Series Insights environment]

Thanks for building on our API platform. We’re excited to see what you develop on top of Time Series Insights.  We’ll continue to invest in tools that make it easier for developers, but if you have anything specific in mind, don’t hesitate to drop us a line at tsipmteam@microsoft.com.


Network virtualization using SCVMM and TFS/VSTS for your Build-Deploy-Test scenarios

Network Virtualization provides the ability to create multiple virtual networks on a shared physical network. Each tenant gets a completely isolated virtual network, which includes support for virtual subnets and virtual routing. Each tenant can use their own IP addresses and subnets in these virtual networks, even if these conflict with or overlap with... Read More

Designing, building, and operating microservices on Azure

I'm excited to announce that the AzureCAT patterns and practices team has published new guidance about microservices titled Designing, building, and operating microservices on Azure.

Microservices have become a popular architectural style for building cloud applications that are resilient, highly scalable, and able to evolve quickly. To be more than just a buzzword, however, microservices require a different approach to designing and building applications.

In this set of articles, we explore how to build and run a microservices architecture on Azure, using Kubernetes as a container orchestrator. Future articles will include Service Fabric. Topics include:

  • Using Domain Driven Design (DDD) to design a microservices architecture.
  • Choosing the right Azure technologies for compute, storage, messaging, and other elements of the design.
  • Understanding microservices design patterns.
  • Designing for resiliency, scalability, and performance.
  • Building a CI/CD pipeline.

Throughout, we focus on an end-to-end scenario for a drone delivery service that lets customers schedule packages to be picked up and delivered via drone. A reference implementation for this project is available on GitHub.

The reference implementation includes a number of different Azure and open source technologies:

  • Azure Container Service (Kubernetes) to run frontend and backend services.
  • Azure Functions to run event-driven services.
  • Linkerd to manage inter-service communication.
  • Prometheus to monitor system/application metrics.
  • Fluentd and Elasticsearch to monitor application logs.
  • Cosmos DB, Azure Data Lake Store, and Azure Redis Cache to store different types of data.
 
The goal of this guidance is to show the end-to-end process of designing, building, and operating microservices under a realistic scenario. We hope you will find it useful in your own projects. As always, we greatly appreciate your feedback.
 

Securing Azure customers from CPU vulnerability

An industry-wide, hardware-based security vulnerability was disclosed today. Keeping customers secure is always our top priority and we are taking active steps to ensure that no Azure customer is exposed to these vulnerabilities. At the time of this blog post, Microsoft has not received any information to indicate that these vulnerabilities have been used to attack Azure customers.

The majority of Azure infrastructure has already been updated to address this vulnerability. Some aspects of Azure are still being updated and require a reboot of customer VMs for the security update to take effect. Many of you received notification in recent weeks of planned maintenance on Azure and have already rebooted your VMs to apply the fix; no further action by you is required.

With the public disclosure of the security vulnerability today, we are accelerating the planned maintenance timing and will begin automatically rebooting the remaining impacted VMs starting at 3:30pm PST on January 3, 2018. To begin this accelerated update, the self-service maintenance window that was available to some customers has now ended.

During this update, we will maintain our SLA commitments for Availability Sets, VM Scale Sets, and Cloud Services. This reduces the impact to availability by rebooting only a subset of your VMs at any given time, ensuring that any solution that follows Azure’s high availability guidance remains available to your customers and users. Operating system and data disks on your VM will be retained during this maintenance. You can see the status of your VMs, and whether the reboot has completed, in the Azure Service Health Planned Maintenance section of the Azure portal.

The majority of Azure customers should not see a noticeable performance impact with this update. We’ve worked to optimize the CPU and disk I/O path and are not seeing a noticeable performance impact after the fix has been applied. A small set of customers may experience some networking performance impact. This can be addressed by turning on Azure Accelerated Networking (Windows, Linux), which is a free capability available to all Azure customers. We will continue to monitor performance closely and address customer feedback.
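For reference, here is a hedged sketch of enabling Accelerated Networking on an existing NIC with the Azure Python SDK (azure-mgmt-network); the resource names and credentials are placeholders, and the attached VM must be deallocated before the setting takes effect.

```python
# Sketch: turn on Accelerated Networking for an existing network interface.
# Assumptions: placeholder credentials/names, azure-mgmt-network installed,
# and the VM attached to this NIC deallocated before applying the change.
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.network import NetworkManagementClient

credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<secret>", tenant="<tenant-id>"
)
client = NetworkManagementClient(credentials, "<subscription-id>")

# Fetch the NIC, flip the flag, and push the updated model back
nic = client.network_interfaces.get("<resource-group>", "<nic-name>")
nic.enable_accelerated_networking = True
poller = client.network_interfaces.create_or_update(
    "<resource-group>", "<nic-name>", nic
)
poller.wait()  # block until the update completes
```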

This Azure infrastructure update addresses the disclosed vulnerability at the hypervisor level and does not require an update to your Windows or Linux VM images. However, as always, you should continue to apply security best practices for your VM images.

Make your R code run faster

There are lots of tricks you can use to make R code run faster: use more efficient data structures, vectorize your R code, and offload complex data-management tasks to databases. Emily Robinson shares many of these R performance tips in a case study on A/B testing at Etsy. The tips are just as valuable as the process Emily shares for evaluating them, and as the process of asking the R community for help. Check out her post, linked below.

Hooked on Data: Making R Code Faster: A Case Study
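To make the vectorization tip concrete, here is the idea sketched in Python; the same principle applies directly in R, where vectorized operators and functions such as ifelse replace explicit loops.

```python
# Illustration of the vectorization principle: push the per-element loop down
# into optimized native code instead of iterating in the interpreter.
import numpy as np

values = np.random.rand(1_000_000)

def loop_sum(xs):
    """Slow: explicit element-by-element iteration in interpreted code."""
    total = 0.0
    for x in xs:
        if x > 0.5:
            total += x
    return total

def vectorized_sum(xs):
    """Fast: one vectorized filter-and-reduce expression."""
    return xs[xs > 0.5].sum()

assert np.isclose(loop_sum(values), vectorized_sum(values))
```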

Geocoding and Routing improvements added to Bing Maps

Geocoding and Routing are core services in mapping platforms. Today, the Bing Maps team is happy to announce improved algorithms and increased coverage for these services, which also directly improve the coverage of many of the new Bing Maps Fleet Management APIs, such as the Isochrone and Distance Matrix APIs, which rely on routing data.

Geocoder Improvements

The Bing Maps team has spent several years developing a new backend geocoder that provides improved matching logic and faster data updates. The new geocoder went into production earlier this year, and the Bing Maps developer APIs were seamlessly updated on the backend, instantly migrating all users to the new geocoder without requiring any code changes.

In addition to creating a new backend geocoder, the team has been working hard on increasing the support for geocoding globally. Bing Maps ranks the level of geocoding support for each country using the following criteria:

  • Rooftop – Addresses are resolved to the latitude/longitude coordinate at the center of the address parcel (property boundary). Rooftop is the highest level of accuracy; its coverage varies by country.
  • Address – Addresses are interpolated to a latitude/longitude coordinate on the street.
  • Street Name – Addresses are resolved to the latitude/longitude coordinate of the street that contains the address. The address number is not processed.
  • Basic – Geocoding support is limited and primarily only accurate to the city level. If an address is valid, Bing Maps attempts to resolve it, but a result is not guaranteed.

The number of countries in which Bing Maps has Rooftop or Address level detailed coverage has grown from 82 to 109 countries.

Highest Level Support    Old Coverage (# of countries)    New Coverage (# of countries)
Rooftop                  17                               46
Address                  65                               63
Street Name              12                               46
Basic                    159                              98

Detailed geocoding coverage information can be found here.
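To see what these precision levels look like from an application, here is a minimal Python sketch against the Bing Maps REST Locations API; the key is a placeholder, and the calculationMethod field in the response reports the tier (for example, Rooftop or Interpolation) that the geocoder achieved.

```python
# Minimal sketch: geocode an address and inspect the precision achieved.
# Assumption: BING_MAPS_KEY is a placeholder for a valid Bing Maps key.
import requests

BING_MAPS_KEY = "<your Bing Maps key>"

resp = requests.get(
    "https://dev.virtualearth.net/REST/v1/Locations",
    params={"q": "One Microsoft Way, Redmond, WA", "key": BING_MAPS_KEY},
)
resp.raise_for_status()

resource = resp.json()["resourceSets"][0]["resources"][0]
lat, lon = resource["point"]["coordinates"]
# calculationMethod indicates the precision tier, e.g. "Rooftop"
method = resource["geocodePoints"][0]["calculationMethod"]
print(f"{lat:.6f}, {lon:.6f} (resolved via {method})")
```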

Routing Service Improvements

Bing Maps provides routing for the following modes of transportation: cars, trucks, walking, and public transit. It offers both real-time and predictive traffic-based routing. The team is constantly tweaking and improving the algorithms and data to provide more efficient and accurate routes. Bing Maps ranks the level of standard routing support for each country using the following criteria:

  • Good - The country/region has detailed road data available in most populated centers and most of these have been verified for accuracy. Coverage is updated frequently. Remote areas may lack some road information.
  • Fair - At a minimum, the country/region has major road data available as well as some detailed road data. Most often, these roads have not been verified for accuracy. Coverage is updated over time. Please visit the map to assess if the current version meets the needs of your application.
  • Major Roads Only - At a minimum, the country/region coverage includes major roads. These roads have not been verified for accuracy. Coverage is updated over time. Please visit the map to assess if the current version meets the needs of your application.

The number of countries in which Bing Maps has good routing coverage has grown from 58 to 116 countries.

Routing Support          Old Coverage (# of countries)    New Coverage (# of countries)
Good                     58                               116
Fair                     82                               54
Major Roads Only         113                              83

Detailed routing coverage information can be found here.
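As a quick illustration, the sketch below requests a driving route with real-time traffic from the Routes API in Python; the key is a placeholder and the waypoints are free-form location strings.

```python
# Minimal sketch: request a driving route with live-traffic timing.
# Assumption: BING_MAPS_KEY is a placeholder for a valid Bing Maps key.
import requests

BING_MAPS_KEY = "<your Bing Maps key>"

resp = requests.get(
    "https://dev.virtualearth.net/REST/v1/Routes/Driving",
    params={
        "wp.0": "Seattle, WA",          # start waypoint
        "wp.1": "Portland, OR",         # end waypoint
        "optimize": "timeWithTraffic",  # factor real-time traffic into the route
        "key": BING_MAPS_KEY,
    },
)
resp.raise_for_status()

route = resp.json()["resourceSets"][0]["resources"][0]
print(route["travelDistance"], route["distanceUnit"])
print(route["travelDurationTraffic"], "seconds with current traffic")
```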

Truck Routing

Truck routing support was recently added to Bing Maps and is supported in 80 countries.

Detailed truck routing coverage information can be found here.

If you have any questions or feedback about Bing Maps, please let us know on the Bing Maps forums or visit the Bing Maps website to learn more about the Bing Maps platform.

- Bing Maps Team
