
The next generation of Azure Alerts has arrived


Today, we are announcing the general availability of the next generation of alerts in Azure. With Azure Monitoring Services, you can set up alerts to monitor metrics and log data for the entire stack across your infrastructure, application, and the Azure platform. With this release, we are providing a new consolidated alerts experience and a new alerts platform that is faster and will be leveraged by other Azure services. Some of our customers have already been using the new alerts in preview and have provided feedback.

“The new unified experience dramatically improves our alert management capabilities. As part of our standard client configuration, we deploy and manage a variety of resource monitors to provide comprehensive coverage of a customer environment. These monitors include PaaS Resource Metrics/Logs, Azure Activity Events, and Log Analytics searches. We can now manage all of these monitors and alerts through a single interface and layer standardized action groups across all of them. This new service means we can offer a more consistent approach to our customers while dramatically reducing the management overhead. 

In addition, the performance improvements have allowed us to respond quicker to critical customer issues. With near real time metrics for things like Heart Beat logs, we can be alerted to a downed server event in record time and take swift action to recover the environment.”

Dugan Sheehan, Principal – Product Architect, Fanatical Azure, Rackspace

Let’s review the faster alerts and the unified experience.

Faster alerts

Metric alerts: The new alerts platform is designed to provide low-latency metric alerts. You can monitor metric values at a frequency as low as 1 minute, and alerts are fired in less than 5 minutes. You can also set alert rules on multi-dimensional metrics, opening up the possibility for more complex or precise alert rules. As an example, using multi-dimensional metrics you can set up an alert rule that monitors ingress rates on a specific API, such as PutBlob or GetBlob, within a storage account.

Log alerts (limited public preview): With the new “Metric alerts for Logs” capability, you can achieve low-latency alerts for certain log types in Log Analytics, including performance counters and heartbeats, which are important for mission-critical infrastructure and applications. As part of this feature, we derive metrics from logs (in Log Analytics) and leverage the new metric alerting platform to provide faster alerts on logs. In addition to Azure resources, this capability allows you to set alert rules for hybrid scenarios. Coupled with the Log Analytics capability to collect data from Azure and on-premises resources, you can leverage “Metric alerts for Logs” to alert on hybrid environments.

Unified experience – a single place to manage all alerts in Azure

Create alert rule

Consistent alert experience: We are introducing a completely re-imagined alerts experience in the Azure portal. With the new experience, you get a consistent look and feel to set and manage alerts across monitoring sources for the new metric alerting platform, logs (see below), and Activity Logs. The unified experience also provides a single point of integration with Azure Action Groups. Action Groups enable you to publish alert notifications to your personal device, automation system, or ITSM platform. You may receive notifications on your personal device via email, SMS, Azure app push, or the recently released Voice call action. Furthermore, you can associate the same Action Group with multiple alerts, eliminating the maintenance burden of configuring notifications for each alert.

“We rely heavily on alerts to stay on top of issues and avoid application downtime. Managing hundreds of alerts can be challenging – we are excited about using the new alert management capability, which greatly simplifies the management of alerts while also providing us with faster, near-real-time metric and log alerts, which is necessary for monitoring the most critical workloads.”

Stanislav Zhelyazkov, Senior Infrastructure Engineer, MVP Microsoft Azure, Sentia Denmark

Support for Log alerts in Azure: The log alerting functionality that was available in the OMS portal is now available in the Azure portal through Azure Monitor! You can now manage alerts and your Azure resources from the Azure portal without having to switch back and forth. Using this capability, you can write query-based alert rules, which give you the power and flexibility to monitor complex scenarios in your environment. We are removing the restriction of 250 alert rules and also providing a tool for you to access your existing alert rules in the OMS portal from the Azure portal.

In addition, you can also create and manage alerts on Application Insights logs (public preview) using the same experience.

The new alert experience is available in the Azure portal under Monitor > Alerts. Previous generation alerting capabilities are now available under Alerts (Classic). If you manage multiple alerts, we recommend taking a look at the new Alerts experience.

Learn more about pricing for these new alerting capabilities. Alerts (Classic), the previous generation of alert rules, will still be available at no charge.

We’re excited that all the features described in this post are available for you to use today, with metric alerts for logs being in limited public preview. For additional details, refer to the product documentation page. Let us know what you think of the next generation of Azure alerts, and as always, we would love to hear from you to improve our capabilities to help serve your needs better. You can reach us at azurealertsfeedback@microsoft.com.


Visual Studio for Mac version 7.5 Preview 1


Earlier this month, we released version 7.4 of Visual Studio for Mac, our IDE for developers on macOS who are building mobile, web, and cloud apps. Today, we’re announcing the first preview of Visual Studio for Mac version 7.5, which you can get by changing the updater channel in Visual Studio for Mac to use the Beta channel. In this release, the top highlights include:

  • Adding new editor support for Razor, JavaScript, and TypeScript.
  • Improving Azure Functions development with support for the .NET Core Preview SDK and with the introduction of new Azure Functions templates.
  • Adding support for the latest releases of .NET Core and C#, with .NET Core 2.1 Preview and C# version 7.2.
  • Making it easier for Xamarin.Forms developers to build apps using .NET Standard.
  • Continuing to improve IDE performance and stability.

See the full Visual Studio for Mac 7.5 Preview release notes to learn about all the changes that made it into this release.

Web Development with Razor, JavaScript, and TypeScript

Editor support for Razor, JavaScript, and TypeScript has been among the top web developer requests we have heard. In this release, new editors are being introduced for each of these languages.

With official support for Razor, you now have syntax highlighting and IntelliSense while editing your C# in .cshtml files.

Syntax highlighting and IntelliSense
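
As a quick illustration, C# completions now light up inside blocks like this hypothetical .cshtml snippet:

@{
    // C# in a .cshtml file now gets colorization and IntelliSense
    var names = new[] { "Ana", "Bo", "Cid" };
}
<ul>
    @foreach (var name in names)
    {
        <li>@name.ToUpper()</li>
    }
</ul>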

In previous releases, our JavaScript editor included support for syntax highlighting and colorization. It has now been rewritten to provide smarter colorization, IntelliSense, brace completion, and the rest of the core editor experience. At the same time, support is being added for TypeScript editing, providing the same colorization, IntelliSense, and editor experience as JavaScript.

TypeScript editor

Behind the scenes, these new editors were made possible by a lot of hard work from the Roslyn and Visual Studio JavaScript tooling teams – Visual Studio for Mac reuses source code from these editing experiences as they appear in Visual Studio 2017 on Windows.

Build serverless solutions with Azure Functions

Last year, we introduced preview support for Azure Functions – enabling development of Azure Functions using C# and .NET with full debugger tooling – based on the Mono runtime. This release includes a new Functions template dialog, along with support for the .NET Core Preview SDK.

Azure Functions templates

Azure Functions templates enable you to quickly create new functions using the most common triggers and templates. To access them, create a new Azure Functions project, right-click on your project, and choose Add > Add Function…
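
For reference, the HTTP trigger template produces a function roughly along these lines (a sketch assuming the .NET Core-based programming model; names are illustrative):

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class HttpTriggerSample
{
    // Runs whenever an HTTP GET or POST arrives at /api/HttpTriggerSample.
    [FunctionName("HttpTriggerSample")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");
        string name = req.Query["name"];
        return name != null
            ? (IActionResult)new OkObjectResult($"Hello, {name}")
            : new BadRequestObjectResult("Please pass a name on the query string");
    }
}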

.NET Core 2.1 and C# 7.2

Visual Studio for Mac version 7.5 Preview 1 is the first release to support the .NET Core 2.1 Preview SDK. You can read all about .NET Core 2.1 Preview in the announcement blog post. Top improvements include faster build performance, closed gaps in ASP.NET Core and EF Core, better compatibility with .NET Framework, and General Data Protection Regulation (GDPR) and security compliance. We’ve also added support for C# 7.2.
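
Among the C# 7.2 additions are readonly structs and the in parameter modifier; a minimal sketch:

// C# 7.2: a readonly struct guarantees immutability, and 'in' parameters
// pass it by reference without defensive copies or mutation.
public readonly struct Point
{
    public double X { get; }
    public double Y { get; }
    public Point(double x, double y) => (X, Y) = (x, y);
}

public static class Geometry
{
    public static double Dot(in Point a, in Point b) => a.X * b.X + a.Y * b.Y;
}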

.NET Standard and Xamarin.Forms

Mobile developers will be happy to see that .NET Standard Library projects are now a fully supported option for sharing code between platforms when building Xamarin.Forms solutions. This release brings numerous bug fixes to improve the .NET Standard developer experience (see the release notes) and we’ve updated the Xamarin.Forms project templates to all use these library projects by default, instead of Portable Class Library projects.

Improving performance and reliability

Finally, we continue our push to improve performance and reliability in the IDE. This release focuses on improving IDE startup time, which has decreased by as much as 50% for some users. We’re also fixing top issues and crashes as they come into the Developer Community site – please keep the feedback coming!

Feedback

We can’t do this alone and we need your help to make the product better! Please try out the Visual Studio for Mac version 7.5 Preview 1 release by installing Visual Studio for Mac now and switching to the Beta update channel in the IDE. Share your comments and bug reports with us on the Developer Community site; you can get there quickly by using the Help > Report Problem… menu from the IDE. We also welcome feature suggestions on our UserVoice site, which you can also access from the Help > Provide a Suggestion… menu.

Miguel de Icaza, Distinguished Engineer, Mobile Developer Tools

Miguel is a Distinguished Engineer at Microsoft, focused on the mobile platform and creating delightful developer tools. With Nat Friedman, he co-founded both Xamarin in 2011 and Ximian in 1999. Before that, Miguel co-founded the GNOME project in 1997 and has directed the Mono project since its creation in 2001, including multiple Mono releases at Novell. Miguel has received the Free Software Foundation 1999 Free Software Award, the MIT Technology Review Innovator of the Year Award in 1999, and was named one of Time Magazine’s 100 innovators for the new century in September 2000.

TFVC Windows Shell Extension for VSTS and TFS 2018

Team Foundation Version Control (TFVC) users looking for a lightweight version control experience integrated into Windows File Explorer will be happy to see the latest release of the TFVC Windows Shell Extension. This tool provides convenient access to many TFVC commands right in the explorer context menu, and the latest release adds support for Team...

Visual Studio 2017 Version 15.7 Preview 2


Today we released the second preview of the next update: Visual Studio 2017 version 15.7. We hope that you will use this Preview and share your feedback with us. To use the Preview, you can install it fresh from here, update the bits directly from the IDE, or, if you have an Azure subscription, provision a virtual machine with this latest preview (starting tomorrow).

The top highlights of this Preview include:

  • Improved IntelliSense for conditional XAML
  • Additional C++ development improvements
  • Streamlined configuration for updating UWP apps
  • Inclusion of TypeScript 2.8
  • Ability to debug JavaScript files using Microsoft Edge
  • Tooling to prevent web application permission problems
  • Support for building additional project types on build servers

This second Preview builds upon the features we added in Preview 1. As always, you can drill into the details of these features by exploring the Visual Studio 2017 version 15.7 Preview release notes. We appreciate your early adoption, engagement, and feedback, as it helps us ship the highest-quality tools to everyone in the Visual Studio community.

Productivity

XAML IntelliSense: The XAML editor now provides IntelliSense for authoring conditional XAML. When you use a type that is not present in the minimum target version of your app, the XAML editor now not only warns you, but also provides several options to fix it. The quick fix figures out the right conditional using statement based on the platform version where the type was first introduced, allowing the app to target a wider range of platform versions while still consuming the latest controls.

XAML IntelliSense
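
Conditional XAML looks roughly like the following (the contract version and control here are illustrative):

<Page
    x:Class="App1.MainPage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:contract5Present="http://schemas.microsoft.com/winfx/2006/xaml/presentation?IsApiContractPresent(Windows.Foundation.UniversalApiContract,5)">

    <!-- Instantiated only when UniversalApiContract v5 is present on the device -->
    <contract5Present:NavigationView>
        <TextBlock Text="Hello, conditional XAML" />
    </contract5Present:NavigationView>
</Page>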

C++ Development

Code Analysis: Five new rules enforcing items from the C++ Core Guidelines regarding use of the Guidelines Support Library are now available.

C++ Standards Conformance: Five more C++17 standard features have been added to the compiler and IntelliSense in this release, bringing MSVC much closer to full conformance with the latest C++ standard. As an example, template argument deduction has been extended to constructors of class templates – when you construct a class template, you no longer have to specify the template arguments. You can now also have public base classes in aggregate types, so they can be initialized using aggregate initialization syntax without writing boilerplate constructors; in braced initializer lists, bases are initialized first, followed by data members.

C++17 template argument deduction for class templates:

Before:

    pair<int, double> p(2, 4.5);
    auto t = make_tuple(4, 3, 2.5);
    lock_guard<std::mutex> lck(foo.mtx);

After:

    pair p(2, 4.5);
    tuple t(4, 3, 2.5);
    auto lck = lock_guard(foo.mtx);

The implementation of C++11 expression SFINAE is now complete, and we have made the corresponding Standard Library changes. We have also implemented parallel algorithms conforming to the ISO C++17 standard.

For more C++ feature additions, please see the release notes.

Universal Windows Platform Development

Automatic updates for sideloaded UWP apps: The Universal Windows Platform allows distributing applications without the Microsoft Store by using a mechanism called “sideloading”. With Visual Studio 2017 version 15.7 Preview 2 and the latest Windows 10 Insider Preview SDK, there is now tooling to easily configure the automatic update settings for these UWP apps.

UWP Side load Update Settings

TypeScript and JavaScript Development

Compiler: Visual Studio 2017 version 15.7 will include TypeScript 2.8.

Developer Productivity: We’ve continued our push to help make TypeScript and JavaScript developers more productive by adding support for fixing all occurrences of a problem in a document (for example, removing unused variables), organizing imports (including sorting and removing unused declarations), and displaying the lightbulb more proactively when optional improvements are possible. We’ve also fixed some of the top issues raised by customers, including premature triggering of snippets, un-cancellable refactorings, hard-to-disable formatting, and incorrect TypeScript version selection. These improvements are powered by TypeScript 2.8, so for the best experience, we recommend updating your existing projects to use the latest TypeScript version.

Performance: One of the best ways to make developers more productive is to improve the performance of their tools. To that end, we’ve made background analysis of closed files optional (Only report diagnostics for files opened in the editor under Tools > Options > Text Editor > JavaScript/TypeScript > Project). We’ve also added support for jsconfig.json – analogous to tsconfig.json – so that JavaScript developers can fine-tune their language service experience in the same way as TypeScript developers.
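
A minimal jsconfig.json might look like this (the options shown are illustrative):

{
  "compilerOptions": {
    "checkJs": true,
    "target": "es6"
  },
  "exclude": [
    "node_modules"
  ]
}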

Debugging using Microsoft Edge: Visual Studio ASP.NET and .NET Core developers on Windows Insider builds can now set breakpoints and debug their JavaScript files using the Microsoft Edge browser. Visual Studio uses the new Edge DevTools Protocol developed by the Microsoft Edge team when targeting the Microsoft Edge browser, which means developers will be able to debug and fix JavaScript issues from within Visual Studio in both Microsoft Edge and Google Chrome. We are glad to enable this oft-requested feature and would love to hear what you think about it.

Web Development

It is often tricky and time-consuming to diagnose runtime application permission problems. In Visual Studio 2017 version 15.7, we’ve made a change to help you identify a specific kind of access issue during development. When running an ASP.NET or ASP.NET Core application on the local machine, the app may not have access to the Key Vault from the account specified under Tools > Options > Azure Service Authentication, and thus won’t be able to run locally. Visual Studio will now detect that case and provide a proactive error during development. This diagnostic shortens the time it takes to discover and fix this local runtime permission problem.

Build Tools

The Visual Studio 2017 Build Tools allow you to create build servers without installing the entire Visual Studio editing environment. Over the past few updates, we’ve been expanding the matrix of project types that the Build Tools support. In the last minor release, we added the ability to build TypeScript and Node.js project types, and in this release we are adding support for building additional project types such as Azure, Office and SharePoint, Mobile development with .NET (Xamarin), ClickOnce, Docker Tools, and Test Tools, as well as installing into containers. Click here to download the preview release of the Visual Studio Build Tools and try out these new capabilities.

Try out the Preview today!

If you’re not familiar with Visual Studio Previews, take a moment to read the Visual Studio 2017 Release Rhythm. Remember that Visual Studio 2017 Previews can be installed side-by-side with other versions of Visual Studio and other installs of Visual Studio 2017 without adversely affecting either your machine or your productivity. Previews provide an opportunity for you to receive fixes faster and try out upcoming functionality before it becomes mainstream. Similarly, the Previews enable the Visual Studio engineering team to validate usage, incorporate suggestions, and detect flaws earlier in the development process. We are highly responsive to feedback coming in through the Previews and look forward to hearing from you.

Please get the Visual Studio Preview today, exercise your favorite workloads, and tell us what you think. If you have an Azure subscription, you can provision a virtual machine of this preview (starting tomorrow). You can report issues to us via the Report a Problem tool in Visual Studio or you can share a suggestion on UserVoice. You’ll be able to track your issues in the Visual Studio Developer Community where you can ask questions and find answers. You can also engage with us and other Visual Studio developers through our Visual Studio conversation in the Gitter community (requires GitHub account). Thank you for using the Visual Studio Previews.

Christine Ruana, Principal Program Manager, Visual Studio

Christine is on the Visual Studio release engineering team and is responsible for making Visual Studio releases available to our customers around the world.

AI, Machine Learning and Data Science Roundup: March 2018


This is the first edition of a monthly roundup of news about Artificial Intelligence, Machine Learning and Data Science. This is an eclectic collection of interesting blog posts, software announcements, applications and events I've noted over the past month or so.

Open Source AI, ML & Data Science News

Comparative performance benchmarks of deep-learning frameworks (including TensorFlow and PyTorch) on GPU architectures.

TensorFlow 1.6.0 released, now supports CUDA 9.0 and cuDNN 7.

Python 2 end-of-life date confirmed as January 1, 2020.

R 3.4.4 released.

Google releases DeepLab on GitHub, a trained CNN model to assign semantic labels to every pixel of an image.

Google releases Lucid, a neural-network visualization library designed to help with the interpretability of vision systems.

Industry News

The Microsoft Cloud Service Map (PDF) lists AWS services with the equivalent Azure service. The site comparecloud.in does something similar for other cloud services as well.

Analysis of the 2018 Gartner Magic Quadrant for Data Science and Machine Learning Platforms, and SAS's future prospects, from Thomas W Dinsmore.

A NYT feature article on conversational bots: To Give A.I. the Gift of Gab, Silicon Valley Needs to Offend You.

Hackernoon: Train Your Machine Learning Models on Google’s GPUs for Free — Forever.

A Preview of Bristlecone, Google’s New Quantum Processor.

Microsoft News

A podcast interview with Joseph Sirosh on the state of AI and Microsoft's Cloud AI services.

Announcing the AI Platform for Windows Developers, which will allow ONNX models to run natively on Windows-based devices. This blog post and video provide a detailed example.

New milestones for Microsoft AI services: Bing Entity Search and Custom Vision Service (preview) now in Azure Portal; Face API can now recognize up to a million distinct faces.

The Geo AI Data Science VM, including ESRI ArcGIS Pro and a land cover classification tutorial dataset, is now available in the Azure Marketplace.

Microsoft ML Server 9.3 and Microsoft R Client 3.4.3 have been released.

Auto insurer Progressive sells policies with Flo, a chatbot in Facebook Messenger built with Azure Bot Service.

A podcast interview about Project InnerEye, an innovative machine learning tool that helps radiologists identify and analyze 3-D images of cancerous tumors. Details on how to implement a similar application can be found in the blog post, Using Microsoft AI to Build a Lung-Disease Prediction Model Using Chest X-Ray Images.

Using AI to automatically redact faces in video.

Microsoft Research has developed a system to translate news articles from Chinese to English with the same accuracy as human translators.

Learning resources

An overview of LUIS, Microsoft's cloud-based service for developing language- and speech-based applications.

The A-Z of Machine Learning / Deep Learning algorithms: a Twitter thread in 26 parts.

What to Use When: a guide to navigating the Microsoft AI landscape.

Tutorial: using the Data Science Virtual Machine and transfer learning to create a dogs-vs-cats image classifier.

An introduction to Docker for data scientists, from the Microsoft Machine Learning blog.

Using deep learning to generate music: Part 1 and Part 2 on The AI Show on Channel 9.

A new e-book: SQL Server 2017 Machine Learning Services with R.

An 8-step, 5-minute tutorial for setting up a cluster in Azure for use with the sparklyr package for R.


ASP.NET Core manageability and Application Insights improvements


There are many great investments on the ASP.NET Core 2.1 roadmap. These investments make ASP.NET Core applications easier to write, host, and test, and help make them security- and standards-compliant. This blog post covers areas of investment in the manageability and monitoring space. It spans ASP.NET Core, .NET, and the Application Insights SDK for ASP.NET Core, and extends beyond the 2.1 milestone.

The main themes of manageability improvements across the application stack are:

  1. Distributed tracing
  2. Cross platform features parity
  3. Runtime awareness
  4. Ease of enablement
  5. App framework self-reporting

Let’s dig into the improvements made and the roadmap ahead in these areas.

Distributed tracing

ASP.NET Core 2.0 applications are distributed tracing aware. The context required to track a distributed trace is automatically created or read from incoming HTTP requests and forwarded along with any outgoing out-of-process calls. Collecting distributed trace details does NOT require application code changes – there is no need to register a middleware or install an agent. You can check out the preview of the end-to-end trace view, shown in the picture below, and more distributed tracing scenarios in Azure Application Insights.

ASP.NET Core 2.0 shipped with support for monitoring incoming HTTP requests and outgoing HttpClient requests. Recently, support was extended to outgoing calls made via SqlClient for .NET Core, the Azure Event Hubs SDK, and the Azure Service Bus SDK. These libraries are instrumented with DiagnosticSource callbacks, which makes distributed tracing easy to consume by any APM or diagnostics tool. More libraries plan to enable DiagnosticSource support so they can participate in distributed traces.
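
To make this concrete, here is a minimal sketch of consuming DiagnosticSource events for outgoing HttpClient calls ("HttpHandlerDiagnosticListener" is the listener name used by HttpClient instrumentation; the logging is illustrative):

using System;
using System.Collections.Generic;
using System.Diagnostics;

class ListenerObserver : IObserver<DiagnosticListener>
{
    public void OnNext(DiagnosticListener listener)
    {
        // Subscribe only to the HttpClient instrumentation events.
        if (listener.Name == "HttpHandlerDiagnosticListener")
            listener.Subscribe(new EventObserver());
    }
    public void OnCompleted() { }
    public void OnError(Exception error) { }
}

class EventObserver : IObserver<KeyValuePair<string, object>>
{
    public void OnNext(KeyValuePair<string, object> evt)
    {
        // Activity.Current carries the distributed trace context.
        if (evt.Key.EndsWith(".Stop") && Activity.Current != null)
            Console.WriteLine($"{Activity.Current.OperationName} took {Activity.Current.Duration}");
    }
    public void OnCompleted() { }
    public void OnError(Exception error) { }
}

// Wire-up, e.g. early in Program.Main:
// DiagnosticListener.AllListeners.Subscribe(new ListenerObserver());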

Application Insights SDK for ASP.NET Core 2.2.1 was shipped recently. It now automatically collects outgoing calls made using the libraries mentioned above.

We are also working with the community to standardize distributed tracing protocols. An accepted standard enables even wider adoption of distributed tracing. It also simplifies mixing components written in different languages, as well as serverless cloud components, in a single microservice environment. Our hope is that this standard will be in place for adoption by the next version of ASP.NET Core.

Cross-platform feature parity

ASP.NET Core applications may target two .NET implementations – .NET Framework and .NET Core – and can run on Windows and Linux. Many efforts are directed at bringing feature parity across these runtime environments.

There are framework investments for better manageability of ASP.NET Core applications across runtime environments. For instance, the System.Diagnostics.PerformanceCounter package was recently released. It allows applications to collect performance counters from .NET Core applications running on Windows. Previously, this package was only available for apps compiled for the .NET Framework.
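
A minimal sketch of reading a standard Windows counter with this package (Windows only; the category and counter names are built-in Windows counters):

using System;
using System.Diagnostics;
using System.Threading;

class Program
{
    static void Main()
    {
        using (var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total"))
        {
            cpu.NextValue();        // the first sample always reads as 0
            Thread.Sleep(1000);     // wait for a measurement interval
            Console.WriteLine($"CPU: {cpu.NextValue():F1}%");
        }
    }
}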

Low-level manageability interfaces like the Profiling API are also getting to feature parity across runtime platforms.

Recently, more Application Insights features were ported from the .NET Framework version to .NET Core. Application Insights SDK for ASP.NET Core version 2.2.1 has live metrics support, a hardened telemetry channel with more reliable data upload, and an adaptive sampling feature that enables better control of telemetry volume and price.

We are excited to announce the public preview of Application Insights Profiler for ASP.NET Core Linux web apps. Learn more at the documentation page Profile ASP.NET Core Azure Linux Web Apps with Application Insights Profiler.

Runtime awareness

The variety of runtime platforms makes the job of monitoring tools harder, so the Application Insights SDK needs to be runtime aware. The team is investing in natively understanding platforms like Azure Web Apps or containers run by Kubernetes.

The ability to associate infrastructure telemetry with application insights is important. Correlating container CPU and the number of running instances with the request load and reliability of an application gives a full picture of application behavior. It helps you find the root cause of a problem faster and apply remediations curated to the runtime environment.

Ease of enablement

When it comes to manageability and diagnostics, the last thing you want to do is redeploy an application to enable additional data collection – especially when the application is running in production. Teams are making a set of investments to simplify the enablement of manageability, monitoring, and diagnostics settings.

Snapshot Debugger will be enabled by default for ASP.NET Core applications running as Azure Web Apps.

Another aspect of easier onboarding is ironing out the Application Insights SDK configuration story. Today, Application Insights predefines many monitoring settings. Those settings work great for the majority of applications; however, changing them is not always easy and intuitive when needed.

ASP.NET Core has many built-in self-reporting capabilities. Exposing them in a form that is easy to consume across runtime platforms is one of the goals of the .NET team. There is a proposal to expose many of the manageability settings and monitoring data via HTTP callbacks.

App framework self-reporting

The ASP.NET framework improves manageability by exposing more internal app metrics. As a result of this discussion, these metrics are exposed in a platform-independent way via EventCounters. Metrics exposed via EventCounters are available for in-process and out-of-process consumption.
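
A minimal sketch of self-reporting with an EventCounter (the source and counter names here are hypothetical, not the ones ASP.NET Core exposes):

using System.Diagnostics.Tracing;

[EventSource(Name = "Sample-RequestMetrics")]
sealed class RequestMetricsEventSource : EventSource
{
    public static readonly RequestMetricsEventSource Log = new RequestMetricsEventSource();
    private readonly EventCounter _requestDuration;

    private RequestMetricsEventSource()
    {
        // Listeners (in-process, or out-of-process via ETW/EventPipe)
        // receive aggregated statistics for this counter.
        _requestDuration = new EventCounter("request-duration-ms", this);
    }

    // Call once per request with the measured duration.
    public void RequestCompleted(double elapsedMilliseconds)
        => _requestDuration.WriteMetric((float)elapsedMilliseconds);
}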

There is another example of a great improvement made in .NET for better manageability and monitoring: stack traces became far more readable in .NET Core 2.1. This blog post outlines a few improvements made recently in this area.

Summary

There are many new manageability and monitoring features coming up. Some of them are committed, some are planned, and some are just proposals. You can help prioritize features by commenting on GitHub for ASP.NET, Application Insights, and .NET. You can also get live updates and participate in the conversation by watching the weekly ASP.NET Community Standup at https://live.asp.net. Your feedback is welcome and appreciated!

Announcing Terraform availability in the Azure Marketplace


In addition to Terraform already being integrated into the Azure Cloud Shell, I’m pleased to announce the availability of the new Terraform solution in the Azure Marketplace. This solution enables teams to use shared identity, via Managed Service Identity (MSI), and shared state, via Azure Storage. These features allow you to use a consistent hosted instance of Terraform for DevOps automation and production scenarios.

Azure Marketplace

The Terraform solution configures Terraform to use Azure Storage instead of the local file system for Terraform state. This remote state implementation locks state while one user is changing it, allowing multiple users to consistently change the state of shared environments, such as production.
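
For reference, a remote state configuration of this kind looks roughly like the following (the storage account and container names are placeholders):

terraform {
  backend "azurerm" {
    storage_account_name = "mytfstateaccount"       # placeholder
    container_name       = "tfstate"                # placeholder
    key                  = "prod.terraform.tfstate"
  }
}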

The template also configures a Managed Service Identity and provides a Role Based Access Control (RBAC) script that will allow this identity to provision resources in the Azure subscription using Terraform. This eliminates the need for managing Service Principal secrets for Terraform separately in automation scenarios such as continuous deployment with Jenkins.

Azure Terraform Provider updates

Development on the Terraform Azure Provider also continues at a furious pace: we passed the 1.0 milestone last December, and version 1.3 has already shipped. As we near complete coverage of our core infrastructure services, such as Virtual Machines, Managed Disks, and Networking, you can expect to see further development to support our platform services as well as more advanced deployment scenarios.

Here are just a handful of the areas where recent additions enable more advanced scenarios in Azure for Terraform users:

  • Platform services
  • Advanced networking services

Microsoft and HashiCorp engineering teams continue to develop the Azure Terraform provider and modules with the Terraform community. If you want to raise feature requests, issues, or even contribute yourself, then please join us in the AzureRM GitHub repository.

Customer adoption

As we continue to enhance Terraform support for Azure, we’re seeing significant adoption of Terraform by Azure customers: in the last year alone, we’ve seen an over 4x increase in the number of our customers using Terraform. Axon is one such customer, leveraging Terraform to scale their Evidence.com solution on Azure.


In fact, Azure customers across a wide range of industries are leveraging Terraform for similar benefits.

Customer logos

I’m excited about the improvements we’re making for Terraform users on Azure. Go ahead and try the new Terraform Solution in the Azure Marketplace or learn more in the Terraform documentation hub.

Azure Databricks, industry-leading analytics platform powered by Apache Spark™


This blog post was co-authored by Ali Ghodsi, CEO, Databricks.

The confluence of cloud, data, and AI is driving unprecedented change. The ability to utilize data and turn it into breakthrough insights is foundational to innovation today. Our goal is to empower organizations to unleash the power of data and reimagine possibilities that will improve our world.

To enable this journey, we are excited to announce the general availability of Azure Databricks, a fast, easy, and collaborative Apache® Spark™-based analytics platform optimized for Azure.

Fast, easy, and collaborative

Over the past five years, Apache Spark has emerged as the open source standard for advanced analytics, machine learning, and AI on Big Data. With a massive community of over 1,000 contributors and rapid adoption by enterprises, we see Spark’s popularity continue to rise.

Azure Databricks is designed in collaboration with Databricks whose founders started the Spark research project at UC Berkeley, which later became Apache Spark. Our goal with Azure Databricks is to help customers accelerate innovation and simplify the process of building Big Data & AI solutions by combining the best of Databricks and Azure.

To meet this goal, we developed Azure Databricks with three design principles.

First, enhance user productivity in developing Big Data applications and analytics pipelines. Azure Databricks’ interactive notebooks enable data science teams to collaborate using popular languages such as R, Python, Scala, and SQL and create powerful machine learning models by working on all their data, not just a sample data set. Native integration with Azure services further simplifies the creation of end-to-end solutions. These capabilities have enabled companies such as renewables.AI to boost the productivity of their data science teams by over 50 percent.

“Instead of one data scientist writing AI code and being the only person who understands it, everybody uses Azure Databricks to share code and develop together.”

- Andy Cross, Director, renewables.AI

Second, enable our customers to scale globally without limits by working on big data with a fully managed, cloud-native service that automatically scales to meet their needs, without high cost or complexity. Azure Databricks not only provides an optimized Spark platform, which is much faster than vanilla Spark, but also simplifies the process of building batch and streaming data pipelines and deploying machine learning models at scale. This makes the analytics process faster for customers such as E.ON and Lennox International, enabling them to accelerate innovation.

“Every day, we analyze nearly a terabyte of wind turbine data to optimize our data models. Before, that took several hours. With Microsoft Azure Databricks, it takes a few minutes. This opens a whole range of possible new applications.”

-  Sam Julian, Product Owner, Data Services, E.ON

“At Lennox International, we have 1000’s of devices streaming data back into our IoT environment. With Azure Databricks, we moved from 60% accuracy to 94% accuracy on detecting equipment failures. Using Azure Databricks has opened the flood gates to all kinds of new use cases and innovations. In our previous process, 15 devices, which created 2 million records, took 6 hours to process. With Azure Databricks, we are able to process 25,000 devices – 10 billion records – in under 14 minutes.”

- Sunil Bondalapati, Director of Information Technology, Lennox International

Third, ensure that we provide our customers with the enterprise security and compliance they have come to expect from Azure. Azure Databricks protects customer data with enterprise-grade SLAs, simplified security and identity, and role-based access controls with Azure Active Directory integration. As a result, organizations can safeguard their data without compromising productivity of their users.

Azure is the best place for Big Data & AI

We are excited to add Azure Databricks to the Azure portfolio of data services and have taken great care to integrate it with other Azure services to unlock key customer scenarios.

High-performance connectivity to Azure SQL Data Warehouse, a petabyte-scale, elastic cloud data warehouse, allows organizations to build Modern Data Warehouses to load and process any type of data at scale for enterprise reporting and visualization with Power BI. It also enables data science teams working in Azure Databricks notebooks to easily access high-value data from the warehouse to develop models.

Integration with Azure IoT Hub, Azure Event Hubs, and Azure HDInsight Kafka clusters enables enterprises to build scalable streaming solutions for real-time analytics scenarios such as recommendation engines, fraud detection, predictive maintenance, and many others.

Integration with Azure Blob Storage, Azure Data Lake Store, Azure SQL Data Warehouse, and Azure Cosmos DB allows organizations to use Azure Databricks to clean, join, and aggregate data no matter where it sits.

We are committed to making Azure the best place for organizations to unlock the insights hidden in their data to accelerate innovation. With Azure Databricks and its native integration with other services, Azure is the one-stop destination to easily unlock powerful new analytics, machine learning, and AI scenarios.

Azure Databricks architecture diagram

Get started today!

We are excited for you to try Azure Databricks! Get started today and let us know your feedback.


Unlock your data’s potential with Azure SQL Data Warehouse and Azure Databricks


Getting the most out of your data is critical for any business in a competitive environment. Businesses need the ability to get the right data into the right hands at the right time. Azure Databricks and Azure SQL Data Warehouse can help you do just that through a Modern Data Warehouse.

Azure SQL Data Warehouse is an elastic, globally available, cloud data warehouse that leverages Massively Parallel Processing (MPP) to quickly run complex queries across petabytes of data. Azure SQL Data Warehouse provides a familiar interface for your analysts who know SQL and want to drive action in your business.

Azure Databricks combines the best of Databricks and Azure to help customers accelerate innovation with one-click set up, streamlined workflows, and an interactive workspace that enables collaboration between data scientists, data engineers, and business analysts powered by Apache Spark.

With the general availability of the Azure Databricks service comes built-in support for Azure SQL Data Warehouse. This enables any data scientist or data engineer to have a seamless experience connecting their Azure Databricks cluster and their Azure SQL Data Warehouse, whether building advanced ETL (extract, transform, load) pipelines for Modern Data Warehouse architectures or accessing relational data for machine learning and AI. We are seeing that the combination of these services can provide a truly transformative analytics platform for businesses.

Advanced ETL in the Modern Data Warehouse

In a cloud-first ecosystem, data is generated in a wide variety of formats based on the source system. It can be structured, semi-structured, nested, and repeated. To run analytics over this data, many customers are choosing to prepare data outside of their warehouse to leverage new skills and tooling that are emerging in their organization. With Azure Databricks, your data engineers can use the full power of the Databricks Runtime with a variety of language choices to develop ETL processes that scale as you grow and write the data directly into SQL Data Warehouse. With this data in SQL Data Warehouse, your analysts can use SQL and leading BI tools like Power BI to perform complex, interactive analysis of your organization’s most valuable data.

Databricks image

Accessing all your data for Machine Learning and AI

In the machine learning and artificial intelligence space, data wrangling is one of the most time-consuming aspects of a data scientist’s job. Data scientists need to be able to connect to and read massive amounts of data to train their models. With built-in support for Azure SQL Data Warehouse, we have made it not only possible but easy for data scientists to gain access to petabytes of data warehouse data. Now they can spend more time on modeling and less time moving data, making more efficient use of their precious time.
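
As a sketch of what this looks like from a notebook, using the Databricks SQL Data Warehouse connector (the server, storage, and table names are placeholders):

// Read a SQL Data Warehouse table into a Spark DataFrame.
val df = spark.read
  .format("com.databricks.spark.sqldw")
  .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydw")
  .option("forwardSparkAzureStorageCredentials", "true")
  .option("tempDir", "wasbs://tempdata@mystorageaccount.blob.core.windows.net/tmp")
  .option("dbTable", "dbo.Sales")
  .load()

df.printSchema()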

We are extremely excited to see how the combination of Azure Databricks and Azure SQL Data Warehouse will transform analytics inside your organization. You can get started today by trying out the ETL into Azure SQL Data Warehouse tutorial.

Next steps

This functionality is generally available today in Databricks Runtime 4.0 and connects to any version of Azure SQL Data Warehouse. To get started, try the tutorial linked above.

Azure Event Hubs integration with Apache Spark now generally available


The Event Hubs team is happy to announce the general availability of our integration with Apache Spark. Now, Event Hubs users can use Spark to easily build end-to-end streaming applications. The Event Hubs connector for Spark supports Spark Core, Spark Streaming, and Structured Streaming for Spark 2.1, Spark 2.2, and Spark 2.3.

For users new to Spark, Spark Streaming and Structured Streaming are scalable, fault-tolerant stream processing engines. These processing engines allow users to process huge amounts of data using complex algorithms expressed with high-level functions like map, reduce, join, and window. This data can then be pushed to file systems, databases, or even back to Event Hubs.

Setting up a stream is easy – check it out:

import org.apache.spark.eventhubs._
import org.apache.spark.sql.SparkSession

val eventHubsConf = EventHubsConf("{EVENT HUB CONNECTION STRING FROM AZURE PORTAL}")
   .setStartingPosition(EventPosition.fromEndOfStream)

// Create a stream that reads data from the specified Event Hub.
val spark = SparkSession.builder.appName("SimpleStream").getOrCreate()
val eventHubStream = spark.readStream
   .format("eventhubs")
   .options(eventHubsConf.toMap)
   .load()

It's as easy as that! Once your events are streaming into Spark, you can process them as you wish. Spark provides a variety of processing options, such as graph analysis and machine learning. Our documentation has more details on linking our connector with your project!

The project is open source and available on GitHub. All details and documentation can be found there. Any and all community involvement is welcome, come say hello! If you like what you see, please star the repo to show your support!

Finally, if you have any questions, comments, or feedback, please join our Gitter chat. Contributors are in the channel to chat and answer questions as they come up. Enjoy the connector!


Join Microsoft at the GPU Technology Conference


From high-performance computing and artificial intelligence to visualization, GPUs have a wide variety of uses. That’s why Microsoft has partnered with NVIDIA to bring a wide variety of NVIDIA GPUs to Azure. Join us in San Jose next week at NVIDIA’s GPU Technology Conference to learn how Azure customers combine the flexibility and elasticity of the cloud with the capability of NVIDIA’s GPUs.

At Booth 603, Microsoft and partners will have demos of customer use cases and experts on hand to talk about how Azure is the cloud for any GPU workload. We will have demos from our partners at Altair, PipelineFX, and Workspot. In addition, you can learn about work we’ve done in oil & gas, automotive, and artificial intelligence.

The conference program includes partner and customer sessions as well as Microsoft technical sessions.

As you can see, NVIDIA GPUs are a critical part of the infrastructure that Azure customers rely on to drive innovation. We recently announced the general availability of our NVIDIA Tesla V100-powered virtual machines and an expansion of our other offerings to more global regions. We’re looking forward to talking to you next week. If you’re interested in Azure credits to get your GPU-powered workload started, email Shakil Ahmed.

Get started building .NET web apps in the browser with Blazor


Today we released our first public preview of Blazor, a new experimental .NET web framework using C#/Razor and HTML that runs in the browser with WebAssembly. Blazor enables full stack web development with the stability, consistency, and productivity of .NET. While this release is alpha quality and should not be used in production, the code for this release was written from the ground up with an eye towards building a production quality web UI framework.

In this release we've laid the groundwork for the Blazor component model and added other foundational features, like routing, dependency injection, and JavaScript interop. We've also been working on the tooling experience so that you get great IntelliSense and completions in the Razor editor. Other features that have been demonstrated previously in prototype form, like live reload, debugging, and prerendering, have not been implemented yet, but are planned for future preview updates. Even so, there is plenty in this release for folks to start kicking the tires and giving feedback on the current direction. For additional details on what's included in this release and known issues, please see the release notes.

Let's get started!

Help & feedback

Your feedback is especially important to us during this experimental phase for Blazor. If you run into issues or have questions while trying out Blazor please let us know!

  • File issues on GitHub for any problems you run into or to make suggestions for improvements.
  • Chat with us and the Blazor community on Gitter if you get stuck or to share how Blazor is working for you.

Also, after you've tried out Blazor for a while please let us know what you think by taking our in-product survey. Just click the survey link shown on the app home page when running one of the Blazor project templates:

Blazor survey

Get started

To get setup with Blazor:

  1. Install the .NET Core 2.1 Preview 1 SDK.
  2. Install the latest preview of Visual Studio 2017 (15.7) with the Web development workload.
    • Note: You can install Visual Studio previews side-by-side with an existing Visual Studio installation without impacting your existing development environment.
  3. Install the ASP.NET Core Blazor Language Services extension from the Visual Studio Marketplace.

To create your first project from Visual Studio:

  1. Select File -> New Project -> Web -> ASP.NET Core Web Application
  2. Make sure .NET Core and ASP.NET Core 2.0 are selected at the top.
  3. Pick the Blazor template

    New Blazor app dialog

  4. Press Ctrl-F5 to run the app without the debugger. Running with the debugger (F5) is not supported at this time.

If you're not using Visual Studio you can install the Blazor templates from the command-line:

dotnet new -i Microsoft.AspNetCore.Blazor.Templates
dotnet new blazor -o BlazorApp1
cd BlazorApp1
dotnet run

Congrats! You just ran your first Blazor app!

Building components

When you browse to the app, you'll see that it has three prebuilt pages: a home page, a counter page, and a fetch data page:

Blazor app home page

These three pages are implemented by the three Razor files in the Pages folder: Index.cshtml, Counter.cshtml, and FetchData.cshtml. Each of these files implements a Blazor component. The home page only contains static markup, but the counter and fetch data pages contain C# logic that will get compiled and executed client-side in the browser.

The counter page has a button that increments a count each time you press it without a page refresh.

Blazor app home page

Normally this kind of client-side behavior would be handled in JavaScript, but here it's implemented in C# and .NET by the Counter component. Take a look at the implementation of the Counter component in the Counter.cshtml file:

@page "/counter"
<h1>Counter</h1>

<p>Current count: @currentCount</p>

<button @onclick(IncrementCount)>Click me</button>

@functions {
    int currentCount = 0;

    void IncrementCount()
    {
        currentCount++;
    }
}

Each Razor (.cshtml) file defines a Blazor component. A Blazor component is a .NET class that defines a reusable piece of web UI. The UI for the Counter component is defined using normal HTML. Dynamic rendering logic (loops, conditionals, expressions, etc.) can be added using Razor syntax. The HTML markup and rendering logic are converted into a component class at build time. The name of the generated .NET class matches the name of the file.

Members of the component class are defined in a @functions block. In the @functions block you can specify component state (properties, fields) as well as methods for event handling or for defining other component logic. These members can then be used as part of the component's rendering logic and for handling events.

Note: Defining components in a single Razor file is typical, but in a future update you will also be able to define component logic in a code behind file.

Each time an event occurs on a component (like the onclick event in the Counter component), that component regenerates its render tree. Blazor will then compare the new render tree against the previous one and apply any modifications to the browser DOM.

Routing to components

The @page directive at the top of the Counter.cshtml file specifies that this component is a page to which requests can be routed. Specifically, the Counter component will handle requests sent to /counter. Without the @page directive the component would not handle any routed request, but it could still be used by other components.

Routing requests to specific components is handled by the Router component, which is used by the root App component in App.cshtml:

<!--
    Configuring this here is temporary. Later we'll move the app config
    into Program.cs, and it won't be necessary to specify AppAssembly.
-->
<Router AppAssembly=typeof(Program).Assembly />

Using components

Once you define a component it can be used to implement other components. For example, we can add a Counter component to the home page of the app, like this:

@page "/"

<h1>Hello, world!</h1>

Welcome to your new app.

<SurveyPrompt Title="How is Blazor working for you?" />

<Counter />

If you build and run the app again (live reload coming soon!) you should now see a separate instance of the Counter component on the home page.

Blazor home page with counter

Component parameters

Components can also have parameters, which you define using public properties on the component class. Let's update the Counter component to have an IncrementAmount property that defaults to 1, but that we can change to something different.

@page "/counter"
<h1>Counter</h1>

<p>Current count: @currentCount</p>

<button @onclick(IncrementCount)>Click me</button>

@functions {
    int currentCount = 0;

    public int IncrementAmount { get; set; } = 1;

    void IncrementCount()
    {
        currentCount += IncrementAmount;
    }
}

The component parameters can be set as attributes on the component tag. In the home page change the increment amount for the counter to 10.

@page "/"

<h1>Hello, world!</h1>

Welcome to your new app.

<Counter IncrementAmount="10" />

When you build and run the app the counter on the home page now increments by 10, while the counter page still increments by 1.

Blazor count by ten

Layouts

The layout for the app is specified using the @layout directive in _ViewImports.cshtml in the Pages folder.

@layout MainLayout

Layouts in Blazor are also built as components. In our app, the MainLayout component in Shared/MainLayout.cshtml defines the app layout.

@implements ILayoutComponent

<div class='container-fluid'>
    <div class='row'>
        <div class='col-sm-3'>
            <NavMenu />
        </div>
        <div class='col-sm-9'>
            @Body
        </div>
    </div>
</div>

@functions {
    public RenderFragment Body { get; set; }
}

Layout components implement ILayoutComponent. In Razor syntax interfaces can be implemented using the @implements directive. The Body property on the ILayoutComponent interface is used by the layout component to specify where the body content should be rendered. In our app the MainLayout component adds a NavMenu component and then renders the body in the main section of the page.

The NavMenu component is implemented in Shared/NavMenu.cshtml and creates a Bootstrap nav bar. The nav links are generated using the built-in NavLink component, which generates an anchor tag with an active CSS class if the current page matches the specified href.

The root component

The root component for the app is specified in the app's Program.Main entry point defined in Program.cs. This is also where you configure services with the service provider for the app.

class Program
{
    static void Main(string[] args)
    {
        var serviceProvider = new BrowserServiceProvider(configure =>
        {
            // Add any custom services here
        });

        new BrowserRenderer(serviceProvider).AddComponent<App>("app");
    }
}

The DOM element selector argument determines where the root component will get rendered. In our case, the app element in index.html is used.

<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8" />
    <title>BlazorApp1</title>
    <base href="/" />
    <link href="css/bootstrap/bootstrap.min.css" rel="stylesheet" />
    <link href="css/site.css" rel="stylesheet" />
</head>
<body>
    <app>Loading...</app> <!--Root component goes here-->

    <script src="css/bootstrap/bootstrap-native.min.js"></script>
    <script type="blazor-boot"></script>
</body>
</html>

Bootstrapping the runtime

At build-time the blazor-boot script tag is replaced with a bootstrapping script that handles starting the .NET runtime and executing the app's entry point. You can see the updated script tag using the browser developer tools.

<script src="_framework/blazor.js" main="BlazorApp1.dll" entrypoint="BlazorApp1.Program::Main" references="Microsoft.AspNetCore.Blazor.dll,netstandard.dll,..."></script>

As part of the build Blazor will analyze which code paths are being used from the referenced assemblies and then remove unused assemblies and code.

Dependency injection

Services registered with the app's service provider are available to components via dependency injection. You can inject services into a component using constructor injection or using the @inject directive. The latter is how an HttpClient is injected into the FetchData component.

@page "/fetchdata"
@inject HttpClient Http

The FetchData component uses the injected HttpClient to retrieve some JSON from the server when the component is initialized.

@functions {
    WeatherForecast[] forecasts;

    protected override async Task OnInitAsync()
    {
        forecasts = await Http.GetJsonAsync<WeatherForecast[]>("/sample-data/weather.json");
    }

    class WeatherForecast
    {
        public DateTime Date { get; set; }
        public int TemperatureC { get; set; }
        public int TemperatureF { get; set; }
        public string Summary { get; set; }
    }
}

The data is deserialized into the forecasts C# variable as an array of WeatherForecast objects and then used to render the weather table.

<table class='table'>
    <thead>
        <tr>
            <th>Date</th>
            <th>Temp. (C)</th>
            <th>Temp. (F)</th>
            <th>Summary</th>
        </tr>
    </thead>
    <tbody>
        @foreach (var forecast in forecasts)
        {
            <tr>
                <td>@forecast.Date.ToShortDateString()</td>
                <td>@forecast.TemperatureC</td>
                <td>@forecast.TemperatureF</td>
                <td>@forecast.Summary</td>
            </tr>
        }
    </tbody>
</table>

Hosting with ASP.NET Core

The Blazor template creates a client-side web application that can be used with any back end, but by hosting your Blazor application with ASP.NET Core you can do full stack web development with .NET.

The "Blazor (ASP.NET Core Hosted)" project template creates a solution with three projects for you: a Blazor client project, an ASP.NET Core server project, and a shared .NET Standard class library project.

To host the Blazor app in ASP.NET Core the server project has a reference to the client project. Blazor-specific middleware makes the Blazor client app available to browsers.

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    app.UseResponseCompression();

    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseMvc(routes =>
    {
        routes.MapRoute(name: "default", template: "{controller}/{action}/{id?}");
    });

    app.UseBlazor<Client.Program>();
}

In the standalone Blazor app the weather forecast data was a static JSON file, but in the hosted project the SampleDataController provides the weather data using ASP.NET Core Web API.

The shared class library project is referenced by both the Blazor client project and the ASP.NET Core server project, so that code can be shared between the two projects. This is a great place to put your domain types. For example, the WeatherForecast class is used by the Web API in the ASP.NET Core project for serialization and in the Blazor project for deserialization.
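
To make the round trip concrete, here's a sketch of what such a Web API controller might look like. It mirrors the template's SampleDataController in spirit, but treat the exact shape as illustrative rather than the template's verbatim code:

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.AspNetCore.Mvc;

[Route("api/[controller]")]
public class SampleDataController : Controller
{
    private static readonly string[] Summaries =
        { "Freezing", "Bracing", "Chilly", "Cool", "Mild", "Warm" };

    // Returns the shared WeatherForecast type, which the Blazor client can
    // deserialize with Http.GetJsonAsync<WeatherForecast[]>(...).
    [HttpGet("[action]")]
    public IEnumerable<WeatherForecast> WeatherForecasts()
    {
        var rng = new Random();
        return Enumerable.Range(1, 5).Select(index =>
        {
            var temperatureC = rng.Next(-20, 55);
            return new WeatherForecast
            {
                Date = DateTime.Now.AddDays(index),
                TemperatureC = temperatureC,
                TemperatureF = 32 + (int)(temperatureC * 9.0 / 5.0),
                Summary = Summaries[rng.Next(Summaries.Length)]
            };
        });
    }
}

The client then requests "api/SampleData/WeatherForecasts" instead of the static JSON file.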

Build a todo list

Let's add a new page to our app that implements a simple todo list. Add a Pages/Todo.cshtml file to the project. We don't have a Blazor component item template in Visual Studio quite yet, but you can use the Razor View item template as a substitute. Replace the content of the file with some initial markup:

@page "/todo"

<h1>Todo</h1>

Now add the todo list page to the nav bar. Update Shared/Navbar.cshtml to add a nav link for the todo list.

<li>
    <NavLink href="/todo">
        <span class='glyphicon glyphicon-th-list'></span> Todo
    </NavLink>
</li>

Build and run the app. You should now see the new todo page.

Blazor todo start

Add a class to represent your todo items.

public class TodoItem
{
    public string Title { get; set; }
    public bool IsDone { get; set; }
}

The Todo component will maintain the state of the todo list. In the Todo component add a field for the todos in a @functions block and a foreach loop to render them.

@page "/todo"

<h1>Todo</h1>

<ul>
    @foreach (var todo in todos)
    {
        <li>@todo.Title</li>
    }
</ul>

@functions {
    IList<TodoItem> todos = new List<TodoItem>();
}

Now we need some UI for adding todos to the list. Add a text input and a button at the bottom of the list.

@page "/todo"

<h1>Todo</h1>

<ul>
    @foreach (var todo in todos)
    {
        <li>@todo.Title</li>
    }
</ul>

<input placeholder="Something todo" />
<button>Add todo</button>

@functions {
    IList<TodoItem> todos = new List<TodoItem>();
}

Nothing happens yet when we click the "Add todo" button, because we haven't wired up an event handler. Add an AddTodo method to the Todo component and register it for the button's click event using the @onclick attribute.

@page "/todo"

<h1>Todo</h1>

<ul>
    @foreach (var todo in todos)
    {
        <li>@todo.Title</li>
    }
</ul>

<input placeholder="Something todo" />
<button @onclick(AddTodo)>Add todo</button>

@functions {
    IList<TodoItem> todos = new List<TodoItem>();

    void AddTodo()
    {
        // Todo: Add the todo
    }
}

The AddTodo method will now get called every time the button is clicked. To get the title of the new todo item, add a newTodo string field and bind it to the value of the text input using the @bind attribute. This sets up two-way binding. We can now update the AddTodo method to add a TodoItem with the specified title to the list. Don't forget to clear the value of the text input by setting newTodo to an empty string.

@page "/todo"

<h1>Todo</h1>

<ul>
    @foreach (var todo in todos)
    {
        <li>@todo.Title</li>
    }
</ul>

<input placeholder="Something todo" @bind(newTodo) />
<button @onclick(AddTodo)>Add todo</button>

@functions {
    IList<TodoItem> todos = new List<TodoItem>();
    string newTodo;

    void AddTodo()
    {
        if (!String.IsNullOrWhiteSpace(newTodo))
        {
            todos.Add(new TodoItem { Title = newTodo });
            newTodo = "";
        }
    }
}

Lastly, what's a todo list without checkboxes? And while we're at it, we can make the title text for each todo item editable as well. Add a checkbox input and a text input for each todo item, and bind their values to the IsDone and Title properties respectively. To verify that these values are being bound, add a count in the header that shows the number of todo items that are not yet done.

@page "/todo"

<h1>Todo (@todos.Where(todo => !todo.IsDone).Count())</h1>

<ul>
    @foreach (var todo in todos)
    {
        <li>
            <input type="checkbox" @bind(todo.IsDone) />
            <input @bind(todo.Title) />
        </li>
    }
</ul>

<input placeholder="Something todo" @bind(newTodo) />
<button @onclick(AddTodo)>Add todo</button>

@functions {
    IList<TodoItem> todos = new List<TodoItem>();
    string newTodo;

    void AddTodo()
    {
        if (!String.IsNullOrWhiteSpace(newTodo))
        {
            todos.Add(new TodoItem { Title = newTodo });
            newTodo = "";
        }
    }
}

Build and run the app and try adding some todo items.

Finished Blazor todo list

Publishing and deployment

Now let's publish our Blazor app to Azure. Right-click on the project and select Publish (if you're using the ASP.NET Core hosted template, make sure you publish the server project). In the "Pick a publish target" dialog, select "App Service" and "Create New".

Pick a publish target

In the "Create App Service" dialog pick a name for the application and select the subscription, resource group, and hosting plan that you want to use. Tap "Create" to create the app service and publish the app.

Create app service

Your app should now be running in Azure.

Blazor on Azure

You can mark that todo item to build your first Blazor app as done! Nice job!

For a more involved Blazor sample app check out the Flight Finder sample on GitHub.

Blazor Flight Finder

Summary

This is the first preview release of Blazor. Already you can get started building component-based web apps that run in the browser with .NET. We invite you to try out Blazor today and let us know what you think about the experience so far. You can file issues on GitHub, take our in-product survey, and chat with us on Gitter. But this is just the beginning! We have many new features and improvements planned for future updates. We hope you enjoy trying out this initial release and we look forward to sharing new improvements with you in the near future.

Automatic Unit Testing in .NET Core plus Code Coverage in Visual Studio Code


I was talking to Toni Edward Solarin on Skype yesterday about his open source spike (early days) of Code Coverage for .NET Core called "coverlet." There are a few options out there for cobbling together .NET Core code coverage, but I wanted to see if I could use the lightest tools I could find and make a "complete" solution for Visual Studio Code that would work for .NET Core cross-platform. I put my own living spike of a project up on GitHub.

Now, keeping in mind that Toni's project is just getting started and (as of the time of this writing) is missing branch coverage, this is still a VERY compelling developer experience.

Using VS Code, Coverlet, xUnit, plus these Visual Studio Code extensions

Here's what we came up with.

Auto testing, code coverage, line coloring, test explorers, all in VS Code

There's a lot going on here but take a moment and absorb the screenshot of VS Code above.

  • Our test project is using xunit and the xunit runner that integrates with .NET Core as expected (a minimal example test is sketched just after this list).
    • That means we can just "dotnet test" and it'll build and run tests.
  • Added coverlet, which integrates with MSBuild and automatically collects coverage when you run "dotnet test /p:CollectCoverage=true".
    • (I think this command-line switch should be more like --coverage, but there may be an MSBuild limitation here.)
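
For context, a minimal test of the kind this setup discovers and runs might look like the following. The names here are hypothetical; Constants.SiteTitle stands in for whatever your app actually exposes:

using Xunit;

public class ConstantsTests
{
    // A trivial fact-style test. When run under coverlet, the lines of the
    // tested assembly that this exercises show up in the coverage output.
    [Fact]
    public void SiteTitle_IsNotEmpty()
    {
        Assert.False(string.IsNullOrWhiteSpace(Constants.SiteTitle));
    }
}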

I'm interested in "The Developer's Inner Loop." That means I want to have my tests open, my code open, and as I'm typing I want the solution to build, run tests, and update code coverage automatically the way Visual Studio proper does auto-testing, but in a more Rube Goldbergian way. We're close with this setup, although it's a little slow.

Coverlet can produce opencover, lcov, or json files as its output. You can then generate detailed reports from these. There is a language-agnostic VS Code extension called Coverage Gutters that can read in lcov files (among others) and highlight line gutters in red, yellow, or green to show test coverage. Those lcov files look like this, showing source file names and, for each instrumented line, the line number and hit count.

SF:C:\github\hanselminutes-core\hanselminutes.core\Constants.cs
DA:3,0
end_of_record
SF:C:\github\hanselminutes-core\hanselminutes.core\MarkdownTagHelper.cs
DA:21,5
DA:23,5
DA:49,5

I should be able to pick the coverage file manually with the extension, but due to a small bug, it's easier to just tell Coverlet to generate a specific file name in a specific format.

dotnet test /p:CollectCoverage=true /p:CoverletOutputFormat=lcov /p:CoverletOutput=./lcov.info .\my.tests

The lcov.info file is then watched by the VS Code Coverage Gutters extension, and the gutters update as the file changes if you click Watch in the VS Code Status Bar.

You can take it even further if you add "dotnet watch test" which will compile and re-run tests if code changes:

dotnet watch --project .\my.tests test /p:CollectCoverage=true /p:CoverletOutputFormat=lcov /p:CoverletOutput=./lcov.info

I can run "WatchTests.cmd" in another terminal, or within the VS Code integrated terminal.

tests automatically running as code changes

NOTE: If you're doing code coverage you'll want to ensure your tests and tested assembly are NOT the same file. You might be able to get it to work but it's easier to keep things separate.

Next, add in the totally underappreciated .NET Core Test Explorer extension (this should have hundreds of thousands of downloads - it's criminal) to get this nice Test Explorer pane:

A Test Explorer tree view in VS Code for NET Core projects

Even better, .NET Test Explorer lights up some "code lens" style interfaces over each test as well as a green checkmark for passing tests. Having "debug test" available for .NET Core is an absolute joy.

Check out "run test" and "debug test"

Finally, we make some specific improvements to the .vscode/tasks.json file that drives much of VS Code's experience with our app. The "build" task is standard, but note the custom "test" and "test with coverage" tasks, as well as the added group with kind "test".

{
    "version": "2.0.0",
    "tasks": [
        {
            "label": "build",
            "command": "dotnet",
            "type": "process",
            "args": [
                "build",
                "${workspaceFolder}/hanselminutes.core.tests/hanselminutes.core.tests.csproj"
            ],
            "problemMatcher": "$msCompile",
            "group": {
                "kind": "build",
                "isDefault": true
            }
        },
        {
            "label": "test",
            "command": "dotnet",
            "type": "process",
            "args": [
                "test",
                "${workspaceFolder}/hanselminutes.core.tests/hanselminutes.core.tests.csproj"
            ],
            "problemMatcher": "$msCompile",
            "group": {
                "kind": "test",
                "isDefault": true
            }
        },
        {
            "label": "test with coverage",
            "command": "dotnet",
            "type": "process",
            "args": [
                "test",
                "/p:CollectCoverage=true",
                "/p:CoverletOutputFormat=lcov",
                "/p:CoverletOutput=./lcov.info",
                "${workspaceFolder}/hanselminutes.core.tests/hanselminutes.core.tests.csproj"
            ],
            "problemMatcher": "$msCompile",
            "group": {
                "kind": "test",
                "isDefault": true
            }
        }
    ]
}

This lets VS Code know what's for building and what's for testing, so if I use the Command Palette to "Run Test" then I'll get this dropdown that lets me run tests and/or update coverage manually if I don't want the autowatch stuff going.

Test or Test with Coverage

Again, all this is just getting started but I've applied it to my Podcast Site that I'm currently rewriting and the experience is very smooth!

Here's a call to action for you! Toni is just getting started on Coverlet and I'm sure he'd love some help. Head over to the Coverlet GitHub repo and don't just file issues and complain! This is an opportunity for you to get to know the deep internals of .NET and create something cool for the larger community.

What are your thoughts?


Sponsor: Get the latest JetBrains Rider for debugging third-party .NET code, Smart Step Into, more debugger improvements, C# Interactive, new project wizard, and formatting code in columns.



© 2018 Scott Hanselman. All rights reserved.
     

The most prolific package maintainers on CRAN


During a discussion with some other members of the R Consortium, the question came up: who maintains the most packages on CRAN? DataCamp maintains a list of the most active maintainers by downloads, but in this case we were interested in the total number of packages per maintainer. Fortunately, this is pretty easy to figure out thanks to the CRAN repository tools now included in R, and a little dplyr (see the code below) gives the answer quickly[*].

And the answer? The most prolific maintainer is Scott Chamberlain from ROpenSci, who is currently the maintainer of 77 packages. Here's a list of the top 20:

                 Maint     n
 1  Scott Chamberlain     77
 2  Dirk Eddelbuettel     53
 3        Jeroen Ooms     40
 4     Hadley Wickham     39
 5       Gábor Csárdi     37
 6           ORPHANED     37
 7   Thomas J. Leeper     29
 8          Bob Rudis     28
 9   Henrik Bengtsson     28
10        Kurt Hornik     28
11       Oliver Keyes     28
12    Martin Maechler     27
13     Richard Cotton     27
14 Robin K. S. Hankin     24
15      Simon Urbanek     24
16      Kirill Müller     23
17    Torsten Hothorn     23
18      Achim Zeileis     22
19       Paul Gilbert     21
20          Yihui Xie     21

(That list of orphaned packages with no current maintainer includes XML, d3heatmap, and flexclust, to name just 3 of the 37.) Here's the R code used to calculate the top 20:

[*]Well, it would have been quick, until I noticed that some maintainers had two forms of their name in the database, one with surrounding quotes and one without. It seemed like it was going to be trivial to fix with a regular expression, but it took me longer than I hoped to come up with the final regexp on line 4 above, which is now barely distinguishable from line noise. As usual, there's an xkcd for this situation:

Perl_problems

 

C++ Core Checks in Visual Studio 2017 15.7 Preview 2


This post was written by Sergiy Oryekhov.

The C++ Core Guidelines Check extension received several new rules in Visual Studio 2017 15.7 Preview 2. The primary focus in this iteration was on checks that make it easier to adopt utilities from the Guidelines Support Library (GSL).

Below is a quick summary of these additions. For more detailed information please see documentation on MSDN: C++ Core Guidelines Checker Reference.

If you’re just getting started with native code analysis tools, take a look at our introductory Quick Start: Code Analysis for C/C++.

New rule sets

There is one new rule category in this release, with a corresponding rule set file that can be selected in the project settings dialog.

  • GSL rules

    Several new rules help catch subtle issues related to how span and view types are used. Catching such issues becomes more important as modern practices of safe memory handling are adopted. In addition, a couple of useful utilities from the Guidelines Support Library are promoted so that user code can be made safer and more uniform.

New rules

Bounds rules

  • C26446 USE_GSL_AT
    This rule suggests using a safer version of an indexing function whenever a subscript operator that doesn't perform range checks is called.
    This rule is also a part of the GSL rule set.

Function rules

  • C26447 DONT_THROW_IN_NOEXCEPT
    This check is applicable to functions marked as ‘noexcept’ and points to the places where such functions invoke code that can potentially throw exceptions.

GSL rules

  • C26445 NO_SPAN_REF
    Some subtle issues may occur when legacy code is migrated to new types that point to memory buffers. One such issue is unintended referencing of spans or views, which may happen if the legacy code used to reference containers like vectors or strings.
  • C26448 USE_GSL_FINALLY
    The Guidelines Support Library has a useful utility that helps add "final action" functionality in a structured and uniform way. This rule helps find places that use 'goto' and may be good candidates for adopting gsl::finally.
  • C26449 NO_SPAN_FROM_TEMPORARY
    Spans and views are efficient and safe for dealing with memory buffers, but they never own the data they point to. This must be taken into account, especially when dealing with legacy code. One dangerous mistake users can make is creating a span over a temporary object; this is a special case of a lifetime issue, similar to referencing local data.

    This rule is enabled by default in the “Native Recommended” rule set.

    Example: Subtle difference in result types.

    // Returns a predefined collection. Keeps data alive.
    // ('int' is used below as a representative element type; the original
    // declarations take this form with some element type.)
    gsl::span<const int> get_seed_sequence() noexcept;

    // Returns a generated collection. Doesn't own new data.
    const std::vector<int> get_next_sequence(gsl::span<const int>);

    void run_batch()
    {
        auto sequence = get_seed_sequence();
        while (send(sequence))
        {
            // 'sequence' would now point into the temporary vector
            // returned by get_next_sequence.
            sequence = get_next_sequence(sequence); // C26449
            // ...
        }
    }
    
  • C26446 USE_GSL_AT
    See “Bounds rules” above.

Feedback

As always, we welcome your feedback. Feel free to send any comments through e-mail at visualcpp@microsoft.com, through Twitter @visualc, or Facebook at Microsoft Visual Cpp.

If you encounter other problems with MSVC in VS 2017, please let us know via the Report a Problem option, either from the installer or the Visual Studio IDE itself. For suggestions, let us know through UserVoice. Thank you!



Azure DNS Private Zones now available in public preview


We are pleased to announce the public preview of DNS Private Zones in all Azure Public cloud regions. This capability provides secure and reliable name resolution for your virtual networks in Azure. Private Zones was announced as a managed preview in fall of last year.


No more custom DNS server burden

Private Zones obviates the need to set up and manage custom DNS servers. You can bring DNS zones to your virtual network as you lift and shift applications to the Azure cloud, or as you build cloud-native applications. You also have the flexibility to use custom domain names, such as your company's domain name.

Name resolution across virtual networks and across regions

Private zones provide name resolution both within a virtual network and across virtual networks. Private zones can span not only virtual networks in the same region, but also virtual networks across regions and subscriptions. This feature is available in all Azure Public cloud regions.

Split-horizon support

You can configure zones with a split-horizon view, allowing for a private and a public DNS zone to share the same name. This is a common scenario when you want to validate your workloads in a local test environment, before rolling out in production for broader consumption. To realize this scenario, simply configure the same DNS zone as both a public zone and private zone in Azure DNS. Now for clients in a virtual network attached to the zone, Azure will return the DNS response from the private zone, and for clients on the internet, Azure will return the DNS response from the public zone. Since name resolution is confined to configured virtual networks, you can prevent DNS exfiltration.


Dynamic DNS Registration

We are introducing two concepts to DNS zones with this update: Registration virtual networks and Resolution virtual networks. When you designate a virtual network as a Registration virtual network, either when creating a private zone or later when updating the zone, Azure dynamically registers DNS A records in the private zone for the virtual machines within that virtual network. Azure also tracks virtual machine additions and removals within the virtual network to keep your private zone updated, without any work on your part.

You can also designate up to 10 virtual networks as Resolution virtual networks when creating or updating a private zone. Forward DNS queries from any of these virtual networks will resolve against the private zone's records. There is no dependency or requirement that the virtual networks be peered for DNS resolution to work across them.

Azure DNS Private Zones also supports reverse DNS queries for the private IP address space of the Registration virtual network.

Familiar Zone and record management

Private zone and record management is done using the same Azure DNS REST APIs, SDKs, PowerShell cmdlets, and CLI as for regular (public) DNS zones. Portal support will follow soon.
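
For example, creating a private zone linked to registration and resolution virtual networks with the .NET management SDK might look roughly like the sketch below. This is an assumption-laden sketch: it presumes the Microsoft.Azure.Management.Dns models expose the zone type and virtual-network lists the way the REST API does, and the resource names plus the credentials, subscriptionId, and virtual-network ID variables are hypothetical.

using System.Collections.Generic;
using Microsoft.Azure.Management.Dns;
using Microsoft.Azure.Management.Dns.Models;

// credentials and subscriptionId are assumed to be obtained elsewhere.
var dnsClient = new DnsManagementClient(credentials) { SubscriptionId = subscriptionId };

var zone = new Zone(location: "global")
{
    ZoneType = ZoneType.Private,
    // A records for VMs in this network are registered automatically.
    RegistrationVirtualNetworks = new List<SubResource> { new SubResource(registrationVnetId) },
    // Forward queries from these networks resolve against the private zone.
    ResolutionVirtualNetworks = new List<SubResource> { new SubResource(resolutionVnetId) }
};

await dnsClient.Zones.CreateOrUpdateAsync("contoso-rg", "contoso.internal", zone);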

We can’t wait for you to try out this capability! For more details please refer to our overview as well as some common scenarios that can be realized using this feature. You can also refer to our documentation on creating and managing private zones using PowerShell and Azure CLI. Please review our public FAQ for answers to the most frequent questions related to this feature.

As always, we love getting your feedback. You can submit your suggestions and feedback for this feature, as well as future scenarios, on our UserVoice channel. And stay tuned for more interesting updates in this space from Azure DNS!

Calling all Desktop Developers: how should UI development be improved?


The user interface (UI) of any application is critical in making your app convenient and efficient for the folks using it. In Enterprise applications, a good UI can shave time off an entire company's workflow. Visual Studio is investing in new tools to improve the productivity of Windows desktop developers, and we'd love your help to make sure the improvements we make are the right ones.

Please fill out our Desktop Developer Survey. It takes 1-2 minutes to fill out and will give us a sense of the kind of applications you build. We will be reaching out to people who respond so that you can help us understand the challenges YOU are dealing with right now. We will use this information to improve Visual Studio and make it easier to build great desktop UI applications.

Take the survey now!

We appreciate your contribution! Thanks!

Top 5 Windows and Microsoft Store Trends Every Developer Should Know


We are excited to share the latest version of the Windows and Microsoft Store trends page – a trusted source for hardware and Microsoft Store aggregated population data. This data provides you with insights to help you make development and business decisions as you build and update your Windows apps. Below are some highlights from the data. We encourage you to visit the trends page to draw your own insights.

Windows 10 adoption

Windows 10 is now on more than 600 million monthly active devices, including Xbox One consoles, tablets, laptops, Windows Mixed Reality headsets, and more. We've also introduced Windows 10 with S mode, which delivers predictable performance and quality to users through apps available in the Microsoft Store.

With Windows 10 adoption growing and the introduction of S mode, there’s never been a better time to build for Windows and distribute via the Microsoft Store.

Growth of Windows 10 adoption from February 2017 to February 2018.

Top 5 trends

  • There are more than 600M monthly active devices using Windows 10.
  • For apps, the “Entertainment” category is the leader in overall percentage of downloads and revenue worldwide.
  • For games, the “Action and adventure” category is the front-runner for overall percentage of revenue and downloads worldwide.
  • For apps, nearly 2/3 of the overall revenue mix on Microsoft Store comes from download-to-own (DTO) products.
  • For games, nearly 2/3 of the overall revenue mix on Microsoft Store comes from add-ons or in-app purchases (IAP).

Graphs showing app downloads and game downloads by category in the Windows Store.

Further analysis

Visit the Windows and Microsoft Store trends page to draw your own insights as you look to modernize and update your existing Windows apps or build new Windows apps.

The post Top 5 Windows and Microsoft Store Trends Every Developer Should Know appeared first on Windows Developer Blog.

Top stories from the VSTS community – 2018.03.23

Here are top stories we found in our streams this week related to DevOps, VSTS, TFS and other interesting topics.

Top Stories

How Microsoft Does DevOps – Dave Harrison: an excellent write-up of an interview by Dave Harrison of Aaron Bjork, with links to Aaron's excellent Agile at Microsoft talk and other content about how Microsoft does... Read More