
New multi-tenant patterns for building SaaS applications on SQL Database


We’re delighted to announce the availability of an expanded set of sample SaaS applications, each using a different database tenancy model on SQL Database. Each sample includes a series of management scripts and tutorials to help you jump start your own SaaS app project. These samples demonstrate a range of SaaS-focused designs and management patterns that can accelerate SaaS application development on SQL Database.

This is an expansion of the sample Wingtip SaaS application launched earlier this year.

SQL Database SaaS app patterns

The same Wingtip Tickets application is implemented in each of the samples. The app is a simple event listing and ticketing SaaS app, where each venue is a tenant with events, ticket prices, customers, and ticket sales. The app, together with the management scripts and tutorials, showcases an end-to-end SaaS scenario. This includes provisioning tenants, monitoring and managing performance, schema management, and cross-tenant reporting and analytics, all at scale.

The three samples differ in the underlying database tenancy model used. The first uses a single-tenant application with an isolated single-tenant database. The second uses a multi-tenant app, with a database per tenant. The third sample uses a multi-tenant app with sharded multi-tenant databases.

Different versions of the Wingtip Tickets SaaS application

Standalone application

This sample uses a single-tenant application with a single-tenant database. The sample can be deployed for multiple tenants. Each tenant’s app is deployed into a separate Azure resource group. This could be in the service provider’s subscription or the tenant’s subscription and managed by the provider on the tenant’s behalf. This pattern provides the greatest tenant isolation, but it is typically the most expensive as there is no opportunity to share resources across multiple tenants.

If you are interested in this SaaS pattern, check out the tutorials and code on GitHub.

Database-per-tenant

The database per tenant model is effective for service providers that are concerned with tenant isolation and want to run a centralized service that allows cost-efficient use of shared resources. A database is created for each venue, or tenant, and all the databases are centrally managed. They can be hosted in elastic pools to provide cost-efficient and easy performance management, which leverages the unpredictable usage patterns of the tenants. A catalog database holds the mapping between tenants and their databases. This mapping is managed using the shard map management features of the Elastic Database Client Library, which also provides efficient connection management to the application.
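As a rough sketch of how that catalog lookup works with the Elastic Database Client Library (the map name, connection strings, and key type here are illustrative placeholders, not taken from the sample):

using Microsoft.Azure.SqlDatabase.ElasticScale.ShardManagement;
using System.Data.SqlClient;

static class TenantConnections
{
    // Opens a connection routed to whichever database holds this tenant's shard.
    public static SqlConnection OpenForTenant(
        int tenantId, string catalogConnectionString, string credentialsConnectionString)
    {
        ShardMapManager manager = ShardMapManagerFactory.GetSqlShardMapManager(
            catalogConnectionString, ShardMapManagerLoadPolicy.Lazy);
        ListShardMap<int> tenantMap = manager.GetListShardMap<int>("tenantcatalog");
        return tenantMap.OpenConnectionForKey(
            tenantId, credentialsConnectionString, ConnectionOptions.Validate);
    }
}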

If you are interested in this SaaS pattern, check out the tutorials and code on GitHub.

Sharded multi-tenant database

Multi-tenant databases are effective for service providers looking for lower cost and simpler management and are okay with reduced tenant isolation. This model allows packing large numbers of tenants into a single database, driving the cost-per-tenant down. This model works well where only a small amount of data storage is required per tenant. Further flexibility is available in this model, allowing you to optimize for cost with multiple tenants in the same database, or optimize for isolation with a single tenant in a database. The choice can be made on a tenant-by-tenant basis, either when the tenant is provisioned or later, with no impact on the design of the application.

If you are interested in this SaaS pattern, check out the tutorials and code on GitHub.

Get started

Learn more about the SaaS app patterns described above. Get started with one of these SaaS app patterns by checking out the tutorials, where you will see instructions on deploying and managing the app. Let us know at saasfeedback@microsoft.com what you think of the samples and patterns, and what you’d like to see added next.


Side-by-side minor version MSVC toolsets in Visual Studio 2017


We’ve been delivering improvements to Visual Studio 2017 more frequently than ever before. Since its first release in March we’ve released four major updates to VS2017 and are currently previewing the fifth update, VS2017 version 15.5.

The MSVC toolset in VS2017 is built as a minor version update to the VS2015 compiler toolset. This minor version bump indicates that the VS2017 MSVC toolset is binary compatible with the VS2015 MSVC toolset, enabling an easier upgrade for VS2015 users. Even though the MSVC compiler toolset in VS2017 delivers many new features and conformance improvements, it is a minor, compatible version update from 14.00 in VS2015 to 14.10 in VS2017.

We’ve made significant updates to the MSVC toolset twice so far in VS2017: once with the first release of VS2017 and again in update version 15.3. We’re making another significant update with VS2017 version 15.5. The MSVC toolsets in 15.1, 15.2, and 15.4 were incremental, bug fix-level updates. For reference, here are the MSVC toolset versions and compiler versions (_MSC_VER) in each release of VS2015 to VS2017. (Note that for historical reasons the MSVC compiler version is 5 higher than the MSVC toolset version displayed in Visual Studio.)

Visual Studio version            MSVC toolset version          MSVC compiler version (_MSC_VER)
VS2015 and Updates 1, 2, & 3     v140 in VS; version 14.00     1900
VS2017, versions 15.1 & 15.2     v141 in VS; version 14.10     1910
VS2017, versions 15.3 & 15.4     v141 in VS; version 14.11     1911
VS2017, version 15.5             v141 in VS; version 14.12     1912
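
If your code needs to adapt to a particular toolset at compile time, you can test _MSC_VER against the values above (a small illustrative sketch, not from the post):

#if defined(_MSC_VER) && _MSC_VER >= 1912
    // Building with the VS2017 version 15.5 toolset (14.12) or later.
#elif defined(_MSC_VER) && _MSC_VER >= 1910
    // Building with an earlier VS2017 toolset (14.10 or 14.11).
#endif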

Minor version updates are designed to be source and binary compatible with previous versions. We test all source changes for compatibility and document all C++ conformance changes and any required source changes with every release. The source changes that we document are almost always forward and backward compatible, meaning that you can compile the code with either a new or old compiler.

Sometimes, despite best efforts, adding new functionality can introduce bugs that affect your code. If you do encounter bugs in your source base, or you need extra time to apply source fixes in your codebase, you may need a way to get back to the previous VS2017 update’s toolset while you update your code or we fix the bug. You can now install the previous minor version MSVC toolset (14.11) if you run into issues with the MSVC compiler in VS2017 version 15.5 Preview 4 (14.12).

Who should use this feature?

This feature is intended as an “escape hatch” for developers who find that there is a bug, either in their source code or in MSVC, that cannot be easily worked around or fixed in a timely fashion. If there’s a conformance issue in your source code, the best option is to apply the proper fixes to make your code conforming, if possible (sometimes there are too many required changes in your code to fix them all immediately). If you believe that there’s a bug in MSVC, it’s best to talk with us so that we can fix the bug or supply a workaround.

This feature is useful in the unlikely event that we can’t deliver a bug fix quickly enough and you cannot apply a workaround in your source code. It’s not meant to be a general-purpose feature. If you run into an issue you should first contact us (see Contact us! below) so we can try to resolve your problem directly.

What alternatives do I have?

If you suspect that you will need to stay on a particular VS2017 update—say your product is ready to ship the day after VS2017 updates—you might consider preserving an offline installation of the VS2017 version that builds the current version of your product. Preserving an offline installation will allow you to install an older version of VS2017 after it’s been updated. You can find more information on this page, Creating an offline installation of Visual Studio 2017.

Older versions of Visual Studio and the MSVC toolset may not be supported

You should note that older versions of Visual Studio and the MSVC toolset follow the standard Visual Studio servicing guidelines. These guidelines specifically advise that only the RTW version and latest versions are supported. Please review the VS servicing guidelines for yourself before using either the side-by-side minor version MSVC toolsets or an offline installation of an older VS2017 version.

How to install side-by-side MSVC toolsets

If the MSVC team has advised a side-by-side toolset as your best option to work around a compiler bug or source issue, how do you install it? It’s actually an option in the VS2017 installer. Just select the “Individual Components” tab at the top of the installer screen and scroll down to the “Compilers, build tools, and runtimes” section. The 14.11 toolset is included in the VS2017 version 15.5 Preview 4 installer.

Individual components in VS setup
Selecting the “VC++ 2017 version 15.4 v14.11 toolset” will also select the current MSVC toolset for VC++ 2017 version 15.5. Projects use the current MSVC toolset by default; you’ll have to edit your project file (.vcxproj) to use the older toolset.

Using a side-by-side minor version MSVC toolset in VS

Side-by-side minor version MSVC toolsets don’t appear in the “Platform Toolset” options of the Project Configuration Properties. To enable them you need to edit the .vcxproj file for your project. Each side-by-side minor version MSVC toolset includes a .props file that can be included in your project’s .vcxproj file.

Before you start, you should add the -Bv compiler option as an Additional Option on the compiler command line. This will show the verbose compiler version information in the build Output box. Just enter “-Bv” in the Project Properties > C/C++ > Command Line edit box.

Adding -Bv to the Additional Options in Project Properties

Now, open the VC\Auxiliary\Build\14.11 directory in the folder where you installed VS2017 version 15.5 Preview 4. For example, using the default install location you’ll find it here: C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\VC\Auxiliary\Build\14.11. You should see three files in this folder. You’ll need to copy one of them, Microsoft.VCToolsVersion.14.11.props, into your solution directory.

Side-by-Side Toolset Directory Listing

Next, open the folder containing your solution by right-clicking on the solution and selecting “Open Folder in File Explorer”.

Open Folder from Solution Explorer

Copy the Microsoft.VCToolsVersion.14.11.props file from the VS2017 version 15.5 Preview 4 folder into your solution directory. The file should sit in the same directory as your project’s solution file, e.g., Project6.sln.

Copy file into Solution directory

Now unload your project by right-clicking on the project and selecting “Unload Project”.

Unload Project in Solution Explorer

Once the project is unloaded, you can edit the project file by right-clicking on the project and selecting “Edit [ProjectName]”.

Edit Project File

Locate the line that says

<Import Project="$(VCTargetsPath)Microsoft.Cpp.Default.props" />

Add a line directly above this line that imports the Microsoft.VCToolsVersion.14.11.props that you just copied into the solution directory:

  <Import Project="$(SolutionDir)Microsoft.VCToolsVersion.14.11.props" />
  <Import Project="$(VCTargetsPath)Microsoft.Cpp.Default.props" />

Now, save the file, then right-click on the project name and select “Reload Project”.

Reload Project in Solution Explorer

If you haven’t already saved the file, you’ll be prompted to close the open .vcxproj file. Select “Yes” to close the file.

Now when you rebuild the solution you’ll see that you’re using the 14.11 MSVC compiler toolset.

Rebuild Solution

Using a side-by-side minor version MSVC toolset from the command line

If you need to use a side-by-side minor version MSVC toolset from the command line you just need to customize a developer command prompt. The command prompts installed with VS2017 version 15.5 Preview 4 are located in the VC\Auxiliary\Build subdirectory of your VS install directory. For example, with the default installation path, they are located in the C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\VC\Auxiliary\Build directory.

In that folder you’ll find four developer command prompts (named vcvars*.bat). Pick any one and create a copy to edit. The contents of these files are pretty simple: they all just invoke vcvarsall.bat with the proper architecture parameter. We’ll do the same, but add a new parameter that tells vcvarsall.bat to set up the environment for the v14.11 toolset: -vcvars_ver=14.11.

Here’s an example of a command to set up the environment for the v14.11 x86-hosted, x64-targeting tools. Running the command cl -Bv shows that the environment is set up for the right version of the tools.
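For example, with the default Preview install path shown above, the copied script might reduce to these two lines (illustrative; adjust the path and architecture for your installation):

call "C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\VC\Auxiliary\Build\vcvarsall.bat" x86_amd64 -vcvars_ver=14.11
cl -Bv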

Side-by-side toolset command prompt

Contact us!

Usually at the end of our blog posts we encourage you to try out the feature we’ve discussed. In this case, we’re doing the opposite. If you think you’ve run into an issue with the MSVC toolset in VS2017 version 15.5 Preview 4 that can’t be worked around in sources, please contact us. We’d like to know about your issue and try to help you address it without having to fall back to an older MSVC toolset. But if you do need the option to install the older MSVC toolset side-by-side with the current toolset, it’s available to you.

As always, we can be reached via the comments below, via email (visualcpp@microsoft.com) and you can provide feedback via Help > Report A Problem in the product, or via Developer Community. You can also find us on Twitter (@VisualC) and Facebook (msftvisualcpp).

Welcome to C# 7.2 and Span


C# 7.2 is the latest point release of C#, and adds a number of small but useful features.

All the features are described in wonderful detail in the docs. Start with the overview, What’s new in C# 7.2, which gives you an excellent introduction to the new set of capabilities. It is worth celebrating that a significant portion of the docs are community contributed, not least the material on the new private protected access modifier.

The dominant theme of C# 7.2 centers on improving expressiveness around working with structs by reference. This is important in particular for high performance scenarios, where structs help avoid the garbage collection overhead of allocation, whereas struct references help avoid excessive copying.

The docs go into detail on this set of features in Reference semantics with value types, and they are shown in my new “talking head” video on Channel 9, New Features in C# 7.1 and C# 7.2.

Several of these features, while generally useful, were added to C# 7.2 specifically in support of the new Span<T> family of framework types. This library offers a unified (and allocation-free) representation of memory from a multitude of different sources, such as arrays, stack allocation and native code. With its slicing capabilities, it obviates the need for expensive copying and allocation in many scenarios, such as string manipulation, buffer management, etc, and provides a safe alternative to unsafe code. This is really a game changer, in my opinion. While you may start out using it mostly for performance-intensive scenarios, it is lightweight enough that I think many new idioms will evolve for using it in every day code.
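To make that concrete, here’s a small illustrative sketch of mine (not from the docs) showing slices as views rather than copies:

using System;

class SpanSketch
{
    static void Main()
    {
        // A stack-allocated buffer: no heap allocation, no GC pressure.
        Span<byte> buffer = stackalloc byte[64];

        // A slice is a view over the same memory, not a copy.
        Span<byte> header = buffer.Slice(0, 8);
        header[0] = 0xFF;   // writes through to buffer[0]

        // Strings can be sliced the same way, without Substring allocations.
        ReadOnlySpan<char> word = "hello world".AsSpan().Slice(6);
        Console.WriteLine(word.Length);   // 5
    }
}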

Jared Parsons gives a great introduction in his Channel 9 video C# 7.2: Understanding Span. In the December 15 issue of MSDN Magazine, Stephen Toub will go into even more detail on Span and how to use it.

C# 7.2 ships with the 15.5 release of Visual Studio 2017.

Enjoy C# 7.2 and Span, and happy hacking!

Mads Torgersen, Lead Designer of C#

Keep Your Skills Up to Date: New Training and Azure Resources


Finding better ways to upskill is a consistent topic that comes up when we talk to you about what’s top of mind. It’s no wonder: the one constant in our industry is change, with new techniques, frameworks, tools, and languages emerging all the time.

Developers by nature are extremely self-reliant when it comes to dealing with this change. Over 90% of the respondents to Stack Overflow’s 2017 Developer Survey indicated that they were at least partially self-taught. Whether through coding boot camps, online courses, online forums/Q+A, or joining an OSS project, you are using a variety of ways to level up your skills and hone your craft for work and for fun.

What kind of learning do developers recommend

Source: 2017 Stack Overflow Developer Survey – https://insights.stackoverflow.com/survey/2017#education

With this in mind, we’re always looking for ways to help you invest in your own development. Today I’m happy to announce several new resources that we hope will do just that:

Dive into Data Science

One topic of particular interest lately is data science – more and more of you are being asked to work on data-intensive projects or are simply interested in building a stronger foundation of knowledge to take on more advanced analytics and ML projects in the future. We’re pleased to offer a subscription to DataCamp, which allows you to master data analysis from any browser, at your own pace, tailored to your needs and expertise. Whether you’re learning R, Python or data visualization, DataCamp provides an intuitive learning platform with interactive lessons and engaging videos, and no software to install or hardware requirements. All Visual Studio subscribers receive a complimentary 3-month subscription and Dev Essentials members receive a complimentary 2-month subscription to DataCamp.

DataCamp Course and Track Listing

Explore New Career Moves while Building Technical and Non-Technical Skills

We’re excited to partner with LinkedIn for the first time to bring our subscribers LinkedIn Learning. Building off Lynda.com’s catalog, LinkedIn Learning brings together technical training with interesting non-technical subjects like leadership, management, marketing and design (many of which we hope will be of interest to all of you, but especially those looking to pursue an entrepreneurial path). On top of this, LinkedIn Learning can help you connect with others in your area of specialization, explore new areas, and get job/salary insights. Visual Studio Enterprise subscribers receive a complimentary 6-month LinkedIn Learning subscription, and Visual Studio Professional, Visual Studio Test Professional and MSDN Platforms subscribers and Dev Essentials members receive a complimentary 3-month subscription to LinkedIn Learning.

LinkedIn Learning

Build Hard Core Technical Skills

As many of you know we’ve partnered with Pluralsight for the past few years to provide our subscribers with on-demand access to a digital ecosystem of learning tools, skill tests, interactive labs and live mentoring. Today, we are announcing changes to the current benefit: for a limited time, all Visual Studio subscribers will have full access to the entire training catalog from Pluralsight. Enterprise subscribers receive a complimentary 6-month subscription, and Professional, Test Pro and MSDN Platforms subscribers get 3-month complimentary subscriptions. Visual Studio Dev Essentials members continue to receive 3-month subscriptions to access Pluralsight’s full catalog.

Pluralsight’s full catalog

Faster Access to Help When You Need It

Getting your Azure development questions answered quickly is important. To help, we’re announcing two new resources:

  • Azure Forums: We are now offering one day response time to questions across 21 forum topics ranging from Azure API Management to Azure SQL Databases. Staffed by community members and the Azure engineering team, subscribers can trust their question will be answered by Azure experts. Check out the Azure Community support tile on my.visualstudio.com to access these forums.
  • Azure Advisory Chat: Azure Advisory Chat is a helpful resource for you as you build Azure solutions using the Visual Studio tools and services.  Chat is a real-time way to get guidance on any Azure questions, big or small.  Our expert support engineers are available 24 hours a day (M-F) to help you whether you are just getting started or already deploying business-critical workloads on Azure.  Support engineers will work with subscribers to set up new tenants, provide basic support for identity and networking, discuss approaches to application migrations, and much more.  Visual Studio Enterprise and Visual Studio Professional subscribers now have access to this benefit.

Activate your new benefits to get started right away. Visit the Visual Studio site to learn more about our developer subscriptions and programs. For additional information or help activating any of these benefits, please visit our documentation site.

If you missed any of the event sessions or want to watch the on-demand trainings, check out the Connect(); page.

We want to hear from you. Let us know what you’d like to see by sharing your feedback, suggestions, thoughts, and ideas in the comments below!

Shawn Nandi, Senior Director – Developer Programs, Partnerships and Planning – Cloud App Dev, Data, and AI Product Marketing
@ShawnNandi

Shawn drives partnerships and business planning for the developer business at Microsoft as well as product marketing for developer programs and subscriptions including Visual Studio Subscriptions and Dev Essentials.

Improvements to Azure Functions in Visual Studio


We’re excited to announce several improvements to the Azure Functions experience in Visual Studio as part of the latest update to the Azure Functions tools on top of Visual Studio 2017 v15.5. (Get the preview now.)

New Function project dialog

To make it easier to get up and running with Azure Functions, we’ve introduced a new Functions project dialog. Now, when creating a Functions project, you can choose one that starts with one of the most popular trigger types (HTTP, Queue, or Timer). If you’re looking for something different, choose the Empty project, then add the item after project creation.
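
For example, choosing the HTTP trigger gives you a runnable function out of the box. Here’s roughly the shape of the generated code (a trimmed sketch based on this release’s templates; the names are illustrative):

using System.Net;
using System.Net.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

public static class HttpTriggerSketch
{
    [FunctionName("HttpTriggerSketch")]
    public static HttpResponseMessage Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequestMessage req,
        TraceWriter log)
    {
        // Log the call and reply; the template body is similarly minimal.
        log.Info("C# HTTP trigger function processed a request.");
        return req.CreateResponse(HttpStatusCode.OK, "Hello from Azure Functions");
    }
}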

Additionally, most Function apps require a valid storage account to be specified in AzureWebJobsStorage. Typically this has meant adding a connection string to the local.settings.json after the function is created. To make it easier to find and configure the connection strings for your Function’s storage account, we’ve introduced a Storage Account picker in the new project dialog.

Storage account picker in new Functions project dialog

The default option is the Storage Emulator. The Storage Emulator is a local service, installed as part of the Azure workload, that offers much of the functionality of a real Azure storage account. If it’s not already running, you can start it by pressing the Windows Start key and typing “Microsoft Azure Storage Emulator”. This is a great option if you’re looking to get up and running quickly – especially if you’re playing around, as it doesn’t require any resources to be provisioned in Azure.

However, the best way to guarantee that all supported features are available to your Functions project is to configure it to use an Azure storage account. To help with this, we’ve added a Browse… option in the Storage Account picker that launches the Azure Storage Account selection dialog. This lets you choose from existing storage accounts that you have access to through your Azure subscriptions.

When the project is created, the connection string for the selected storage account will be added to the local.settings.json file and you’ll be able to run your Functions project straight away!
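
For reference, here’s roughly what the resulting local.settings.json looks like when the Storage Emulator is selected (a sketch; a real Azure storage account would carry its full connection string instead):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true"
  }
}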

.NET Core support

You can now create Azure Functions projects inside Visual Studio that target .NET Core. When creating a Functions project, you can choose a target from the selector at the top of the new project dialog. If you choose the Azure Functions v2 (.NET Standard) target, your project will run against .NET Core or .NET Framework.

Choose Azure Functions runtime

Manage Application Settings

An important part of deploying Functions to Azure is adding appropriate application settings. Azure Functions projects store local settings in the local.settings.json file, but this file does not get published to Azure (by design). So, the settings that control the application running in Azure need to be manually configured. As part of our new tooling improvements, we’ve added the ability for you to view and edit your Function’s app settings in the cloud from within Visual Studio. On the Publish page of the Connected Services dialog, you’ll find an option to Manage Application Settings….

Manage App Settings link in Publish dialog

This launches the Application Settings dialog, which allows you to view, update, add and remove app settings just like you would on the Azure portal. When you’re satisfied with the changes, you can press Apply to push the changes to the server.

Application Settings editor

Detect mismatching Functions runtime versions

To keep you from developing locally against an out-of-date version of the runtime, after you publish a Functions app we’ll now compare your local runtime version against the portal’s version. If they are different, Visual Studio will offer to change the app settings in the cloud to match the version you are using locally.

Update mismatching Functions extension version

Try out the new features

Download the latest version of Visual Studio 2017 (v15.5) and start enjoying the improved Functions experience today.

Ensure you have the Azure workload installed and the latest version of the Azure Functions and Web Jobs Tools.
Note: If you have a fresh installation, you may need to manually apply the update to Azure Functions and Web Jobs Tools. Look for the new notifications flag in the Visual Studio title bar. Clicking the link in the Notifications window opens the Extensions and Updates dialog. From there you can click Update to upgrade to the latest version.

Update notifications

If you have any questions or comments, please let us know by posting in the comments section below.

Announcing F# support for .NET Core and .NET Standard projects in Visual Studio


We’re pleased to announce that Visual Studio 2017 15.5 Preview 4 now supports F# projects targeting .NET Core, .NET Standard, and .NET Framework through the .NET Core SDK. Some of you have noticed various levels of this support in the first, second, and third previews. We still had a few work items left to complete when those were released, so we didn’t announce the support at that time. These work items are now complete!

Get started

Firstly, support for F# is now included by default in any Visual Studio workload which requires the .NET Core SDK. The .NET Core workload, ASP.NET workload, and Azure workload will all now install F# support by default:

If you install any of these workloads, you can then create new .NET Core/.NET Standard projects with File | New Project:

The current preview has support for Console Apps, Libraries, and Unit test projects.

Existing projects created via the .NET Core CLI can also be opened in Visual Studio. Here is a video of an existing project I have that I didn’t create in Visual Studio 2017:

What’s new

There are several new features in this new project support, and if you’ve used it with C# and .NET Core or .NET Standard projects, you’ll likely know about all of them. Here are a few that I like:

  • Project files are significantly smaller, often by an order of magnitude.
  • Project files are editable without having to unload the project.
  • You can edit a project file (e.g., adding a package, as shown in the sketch after this list) and when it’s saved, the project system will automatically react to those changes (such as restoring an added package).
  • NuGet dependencies, the SDK reference, and project-to-project references are unified under the Dependencies node.
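
As an example of the last few points, here’s a sketch of a small SDK-style F# library project file (the file and package names are illustrative):

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <!-- F# compilation order is explicit, so files are listed in order. -->
    <Compile Include="Library.fs" />
  </ItemGroup>

  <ItemGroup>
    <!-- Adding a PackageReference and saving triggers an automatic restore. -->
    <PackageReference Include="Newtonsoft.Json" Version="10.0.3" />
  </ItemGroup>

</Project>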

Expanded functionality with Web SDK projects

The project I opened uses the Giraffe library, which provides functional API routing atop the Kestrel HTTP server. Because this project type specifies Microsoft.NET.Sdk.Web as the Sdk attribute in its project file, it is rendered as a web project in the UI. But that’s not all:

Application publish tooling all works with F# projects as well. Publishing this web app to Azure App Service through Visual Studio works the same way as C# projects. But it doesn’t end there. You can also use these projects with Visual Studio Team Services to set up a CI/CD pipeline:

The Continuous Delivery Tools for Visual Studio can be used on these F# project types to autogenerate a CI/CD pipeline which publishes to Azure App Service. There’s a little bit of configuration left to do in the generated Build and Release Definitions after auto-generating them, but it’s very simple to tweak.

Naturally, you can configure the release definition to publish wherever you like – it just uses Azure in the Visual Studio tooling that generates the definitions right now. As other teams at Microsoft continue to expand this sort of functionality, these F# projects will see all those benefits.

Get started with F# and Web SDK projects

You can easily create Web SDK projects in F# with the .NET Core CLI, which also ships with Visual Studio. The following three commands will create projects using ASP.NET Core:

dotnet new webapi -lang F#
dotnet new web -lang F#
dotnet new mvc -lang F#

Project templates for these project types will be available in a future release of Visual Studio 2017. Community templates which use the web SDK (such as the Giraffe template) will also have all of the above benefits.

The road ahead for F# and .NET Core

Although we’ve come a long way with F# and .NET Core, there are still some remaining issues left for us to address:

  • After creating a new project-to-project reference, the Error List in Visual Studio can show errors, even though you get IntelliSense and your project builds. You can close/reopen documents for the errors to go away.
  • Solution Explorer does not show the compilation order of files after you first add them and move them in the project file. If you reload a project, ordering will be shown correctly.
  • Folder support is limited – all folders are rendered above files in the root directory of a project. We’re going to work out a design on how to handle this gracefully with file ordering in mind.
  • C# libraries must first be built before their symbols are visible in F# projects.
  • F# and ASP.NET Core templates are not yet available in the File | New project dialog.
  • We’re going to continue improving our cross-platform debugging support, specifically in Portable PDB generation.
  • Performance of solution loading and cross-project IDE features in large .NET Core/.NET Standard solutions is still a place where major investments are being made by the team which owns the underlying project system F# uses for these project types. F# will benefit from all of these investments.
  • Type Provider support for .NET Core is converging on a design that will enable quality support on .NET Standard and .NET Core. This redesign will involve changes to Type Provider libraries.
  • We’ll also soon begin work on enabling F# Interactive for .NET Core. This will be nontrivial work, so we don’t have an estimated date of completion at this time.
  • There are no Visual Studio templates for F# Azure Functions yet. We’re also going to work with the Azure Functions team to ensure that F# templates for different Function types will be in a forthcoming release.

Finally, we are laying the groundwork for a long-term effort of migrating all F# projects to the new project system that .NET Core and .NET Standard projects use. This will enable us to retire our current, F#-specific project system that .NET Framework projects use today. We’ll first do this by “dogfooding” our own codebase. There is no timeline for when this will happen, as it is a long-term effort, but it is a major goal of ours because it enables us to hand off project system concerns to experts in that space so that we can focus on the F# compiler and F# tooling.

Send us feedback!

We’re very excited for you to use this new tooling support, and we highly appreciate everyone who finds issues and sends us general feedback. You can file issues on our GitHub repository, which is where we do most of our work. We’ll also move issues created with the Visual Studio feedback tool over to GitHub, so you can use that, too. Thank you, and happy F# coding!

Introducing Azure DevOps Project

In today’s world, organizations need to innovate and get to market faster. This requires learning the latest technologies, using them in your product, and deploying at a faster pace. Adopting Azure is one such scenario. Existing on-premises apps are getting migrated to Azure and new applications are getting developed to take advantage of Azure services. But... Read More

Introducing Nullable Reference Types in C#


Today we released a prototype of a C# feature called “nullable reference types”, which is intended to help you find and fix most of your null-related bugs before they blow up at runtime.

We would love for you to install the prototype and try it out on your code! (Or maybe a copy of it! 😄) Your feedback is going to help us get the feature exactly right before we officially release it.

Read on for an in-depth discussion of the design and rationale, and scroll to the end for instructions on how to get started!

The billion-dollar mistake

Tony Hoare, one of the absolute giants of computer science and recipient of the Turing Award, invented the null reference! It’s crazy these days to think that something so foundational and ubiquitous had to be invented, but there it is. Many years later in a talk, Sir Tony actually apologized, calling it his “billion-dollar mistake”:

I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn’t resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.

There’s general agreement that Tony is actually low-balling the cost here. How many null reference exceptions have you gotten over the years? How many of them were in production code that was already tested and shipped? And how much extra effort did it take to verify your code and chase down potential problems to avoid even more of them?

The problem is that null references are so useful. In C#, they are the default value of every reference type. What else would the default value be? What other value would a variable have, until you can decide what else to assign to it? What other value could we pave a freshly allocated array of references over with, until you get around to filling it in?

Also, sometimes null is a sensible value in and of itself. Sometimes you want to represent the fact that, say, a field doesn’t have a value. That it’s ok to pass “nothing” for a parameter. The emphasis is on sometimes, though. And herein lies another part of the problem: Languages like C# don’t let you express whether a null right here is a good idea or not.

Yet!

What can be done?

There are some programming languages, such as F#, that don’t have null references or at least push them to the periphery of the programming experience. One popular approach instead uses option types to express that a value is either None or Some(T) for a given reference type T. Any access to the T value itself is then protected behind a pattern matching operation to see if it is there: The developer is forced, in essence, to “do a null check” before they can get at the value and start dereferencing it.

But that’s not how it works in C#. And here’s the problem: We’re not going to add another kind of nulls to C#. And we’re not going to add another way of checking for those nulls before you access a value. Imagine what a dog’s breakfast that would be! If we are to do something about the problem in C#, it has to be in the context of existing nulls and existing null checks. It has to be in a way that can help you find bugs in existing code without forcing you to rewrite everything.

Step one: expressing intent

The first major problem is that C# does not let you express your intent: is this variable, parameter, field, property, result, etc. supposed to be null or not? In other words, is null part of the domain, or is it to be avoided?

We want to add such expressiveness. Either:

  1. A reference is not supposed to be null. In that case it is alright to dereference it, but you should not assign null to it.
  2. A reference is welcome to be null. In that case it is alright to assign null to it, but you should not dereference it without first checking that it isn’t currently null.

Reference types today occupy an unfortunate middle ground where both null assignment and unchecked dereferencing are encouraged.

Naively, this suggests that we add two new kinds of reference types: “safely nonnullable” reference types (maybe written string!) and “safely nullable” reference types (maybe written string?) in addition to the current, unhappy reference types.

We’re not going to do that. If that’s how we went about it, you’d only get safe nullable behavior going forward, as you start adding these annotations. Any existing code would benefit not at all. I guess you could push your source code into the future by adding a Roslyn analyzer that would complain at you for every “legacy” reference type string in your code that you haven’t yet added ? or ! to. But that would lead to a sea of warnings until you’re done. And once you are, your code would look like it’s swearing at you, with punctuation? on! every? declaration!

In a certain weird way we want something that’s more intrusive in the beginning (complains about current code) and less intrusive in the long run (requires fewer changes to existing code).

This can be achieved if instead we add only one new “safe” kind of reference type, and then reinterpret existing reference types as being the other “safe” kind. More specifically, we think that the default meaning of unannotated reference types such as string should be non-nullable reference types, for a couple of reasons:

  1. We believe that it is more common to want a reference not to be null. Nullable reference types would be the rarer kind (though we don’t have good data to tell us by how much), so they are the ones that should require a new annotation.
  2. The language already has a notion of – and a syntax for – nullable value types. The analogy between the two would make the language addition conceptually easier, and linguistically simpler.
  3. It seems right that you shouldn’t burden yourself or your consumer with cumbersome null values unless you’ve actively decided that you want them. Nulls, not the absence of them, should be the thing that you explicitly have to opt in to.

Here’s what it looks like:

class Person
{
    public string FirstName;   // Not null
    public string? MiddleName; // May be null
    public string LastName;    // Not null
}

This class is now able to express the intent that everyone has a first and a last name, but only some people have a middle name.

Thus we get to the reason we call this language feature “nullable reference types”: Those are the ones that get added to the language. The nonnullable ones are already there, at least syntactically.

Step two: enforcing behavior

A consequence of this design choice is that any enforcement will add new warnings or errors to existing code!

That seems like a breaking change, and a really bad idea, until you realize that part of the purpose of this feature is to find bugs in existing code. If it can’t find new problems with old code, then it isn’t worth its salt!

So we want it to complain about your existing code. But not obnoxiously. Here’s how we are going to try to strike that balance:

  1. All enforcement of null behavior will be in the form of warnings, not errors. As always, you can choose to run with warnings as errors, but that is up to you.
  2. There’s a compiler switch to turn these new warnings on or off. You’ll only get them when you turn it on, so you can still compile your old code with no change.
  3. The warnings will recognize existing ways of checking for null, and not force you to change your code where you are already diligently doing so.
  4. There is no semantic impact of the nullability annotations, other than the warnings. They don’t affect overload resolution or runtime behavior, and generate the same IL output code. They only affect type inference insofar as it passes them through and keeps track of them in order for the right warnings to occur on the other end.
  5. There is no guaranteed null safety, even if you react to and eliminate all the warnings. There are many holes in the analysis by necessity, and also some by choice.

To that last point: Sometimes a warning is the “correct” thing to do, but would fire all the time on existing code, even when it is actually written in a null safe way. In such cases we will err on the side of convenience, not correctness. We cannot be yielding a “sea of warnings” on existing code: too many people would just turn the warnings back off and never benefit from it.

Once the annotations are in the language, it is possible that folks who want more safety and less convenience can add their own analyzers to juice up the aggressiveness of the warnings. Or maybe we add an “Extreme” mode to the compiler itself for the hardliners.

In light of these design tenets, let’s look at the specific places we will start to yield warnings when the feature is turned on.

Avoiding dereferencing of nulls

First let’s look at how we would deal with the use of the new nullable reference types.

The design goal here is that if you mark some reference types as nullable, but you are already doing a good job of checking them for null before dereferencing, then you shouldn’t get any warnings. This means that the compiler needs to recognize you doing a good job. The way it can do that is through a flow analysis of the consuming code, similar to what it currently does for definite assignment.

More specifically, for certain “tracked variables” it will keep an eye on their “null state” throughout the source code (either “not null” or “may be null“). If an assignment happens, or if a check is made, that can affect the null state in subsequent code. If the variable is dereferenced at a place in the source code where its null state is “may be null“, then a warning is given.

void M(string? ns)            // ns is nullable
{
    WriteLine(ns.Length);     // WARNING: may be null
    if (ns != null)
    {
        WriteLine(ns.Length); // ok, not null here
    }
    if (ns == null)
    {
        return;               // not null after this
    }
    WriteLine(ns.Length);     // ok, not null here
    ns = null;                // null again!
    WriteLine(ns.Length);     // WARNING: may be null
}

In the example you can see how the null state of ns is affected by checks, assignments and control flow.

Which variables should be tracked? Parameters and locals for sure. There can be more of a discussion around fields and properties in “dotted chains” like x.y.z or this.x, or even a field x where the this. is implicit. We think such fields and properties should also be tracked, so that they can be “absolved” when they have been checked for null:

void M(Person p)
{
    if (p.MiddleName != null)
    {
        WriteLine(p.MiddleName.Length); // ok
    }
}

This is one of those places where we choose convenience over correctness: there are many ways that p.MiddleName could become null between the check and the dereference. We would be able to track only the most blatant ones:

void M(Person p)
{
    if (p.MiddleName != null)
    {
        p.ResetAllFields();             // can't detect change
        WriteLine(p.MiddleName.Length); // ok

        p = GetAnotherPerson();         // that's too obvious
        WriteLine(p.MiddleName.Length); // WARNING: saw that!
    }
}

Those are examples of false negatives: we just don’t realize you are doing something dangerous, changing the state that we are reasoning about.

Despite our best efforts, there will also be false positives: Situations where you know that something is not null, but the compiler cannot figure it out. You get an undeserved warning, and you just want to shut it up.

We’re thinking of adding an operator for that, to say that you know better:

void M(Person p)
{
    WriteLine(p.MiddleName.Length);  // WARNING: may be null
    WriteLine(p.MiddleName!.Length); // ok, you know best!
}

The trailing ! on an expression tells the compiler that, despite what it thinks, it shouldn’t worry about that expression being null.

Avoiding nulls

So far, the warnings were about protecting nulls in nullable references from being dereferenced. The other side of the coin is to avoid having nulls at all in the nonnullable references.

There are a couple of ways null values can come into existence, and most of them are worth warning about, whereas a couple of them would cause another “sea of warnings” that is better to avoid:

  1. Assigning or passing null to a non-nullable reference type. That is pretty egregious, right? As a general rule we should warn on that (though there are surprising counterarguments to some cases, still under debate).
  2. Assigning or passing a nullable reference type to a nonnullable one. That’s almost the same as 1, except you don’t know that the value is null – you only suspect it. But that’s good enough for a warning.
  3. A default expression of a nonnullable reference type. Again, that is similar to 1, and should yield a warning.
  4. Creating an array with a nonnullable element type, as in new string[10]. Clearly there are nulls being made here – lots of them! But a warning here would be very harsh. Lots of existing code would need to be changed – a large percentage of the world’s existing array creations! Also, there isn’t a really good workaround. This seems like one we should just let go.
  5. Using the default constructor of a struct that has a field of nonnullable reference type. This one is sneaky, since the default constructor (which zeroes out the struct) can even be implicitly used in many places. Probably better not to warn, or else many existing struct types would be rendered useless.
  6. Leaving a nonnullable field of a newly constructed object null after construction. This we can do something about! Let’s check to see that every constructor assigns to every field whose type is nonnullable, or else yield a warning.

Here are examples of all of the above:

void M(Person p)
{
    p.FirstName = null;          // 1 WARNING: it's null
    p.LastName = p.MiddleName;   // 2 WARNING: may be null
    string s = default(string);  // 3 WARNING: it's null
    string[] a = new string[10]; // 4 ok: too common
}

struct PersonHandle
{
    public Person person;        // 5 ok: too common
}

class Person
{
    public string FirstName;     // 6 WARNING: uninitialized
    public string? MiddleName;
    public string LastName;      // 6 WARNING: uninitialized
}
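
For case 6, you can satisfy the warning by making sure every constructor assigns the nonnullable fields. A sketch of a fixed-up Person:

class Person
{
    public string FirstName;     // ok: assigned in every constructor
    public string? MiddleName;   // ok: allowed to be null
    public string LastName;      // ok: assigned in every constructor

    public Person(string firstName, string lastName)
    {
        FirstName = firstName;
        LastName = lastName;
    }
}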

Once again, there will be cases where you know better than the compiler that either a) that thing being assigned isn’t actually null, or b) it is null but it doesn’t actually matter right here. And again you can use the ! operator to tell the compiler who’s boss:

void M(Person p)
{
    p.FirstName = null!;        // ok, you asked for it!
    p.LastName = p.MiddleName!; // ok, you handle it!
}

A day in the life of a null hunter

When you turn the feature on for existing code, everything will be nonnullable by default. That’s probably not a bad default, as we’ve mentioned, but there will likely be places where you should add some ?s.

Luckily, the warnings are going to help you find those places. In the beginning, almost every warning is going to be of the “avoid nulls” kind. All these warnings represent a place where either:

  1. you are putting a null where it doesn’t belong, and you should fix it – you just found a bug! – or
  2. the nonnullable variable involved should actually be changed to be nullable, and you should fix that.

Of course as you start adding ? to declarations that should be allowed to be null, you will start seeing a different kind of warnings, where other parts of your existing code are not written to respect that nullable intent, and do not properly check for nulls before dereferencing. That nullable intent was probably always there but was inexpressible in the code before.

So this is a pretty nice story, as long as you are just working with your own source code. The warnings drive quality and confidence through your source base, and when you’re done, your code is in a much better state.

But of course you’ll be depending on libraries. Those libraries are unlikely to add nullable annotations at exactly the same time as you. If they do so before you turn the feature on, then great: once you turn it on you will start getting useful warnings from their annotations as well as from your own.

If they add annotations after you, however, then the situation is more annoying. Before they do, you will “wrongly” interpret some of their inputs and outputs as non-null. You’ll get warnings you didn’t “deserve”, and miss warnings you should have had. You may have to use ! in a few places, because you really do know better.

After the library owners get around to adding ?s to their signatures, updating to their new version may “break” you in the sense that you now get new and different warnings from before – though at least they’ll be the right warnings this time. It’ll be worth fixing them, and you may also remove some of those !s you temporarily added before.

We spent a large amount of time thinking about mechanisms that could lessen the “blow” of this situation. But at the end of the day we think it’s probably not worth it. We base this in part on the experience from TypeScript, which added a similar feature recently. It shows that in practice, those inconveniences are quite manageable, and in no way inhibitive to adoption. They are certainly not worth the weight of a lot of extra “mechanism” to bridge you over in the interim. The right thing to do if an API you use has not added ?s in the right places is to push its owners to get it done, or even contribute the ?s yourself.

Become a null hunter today!

Please install the prototype and try it out in VS!

Go to github.com/dotnet/csharplang/wiki/Nullable-Reference-Types-Preview for instructions on how to install and give feedback, as well as a list of known issues and frequently asked questions.

Like all other C# language features, nullable reference types are being designed in the open here: github.com/dotnet/csharplang.
We look forward to walking the last nullable mile with you, and getting to a well-tuned, gentle and useful null-chasing feature with your help!

Thank you, and happy hunting!

Mads Torgersen, Lead Designer of C#


Introducing Tensor for multi-dimensional Machine Learning and AI data


Overview

Tensor is an exchange type for homogeneous multi-dimensional data for 1 to N dimensions. The motivation behind introducing Tensor<T> is to make it easy for vendors of machine-learning libraries like CNTK, TensorFlow, Caffe, and scikit-learn to port their libraries over to .NET with minimal dependencies. Tensor<T> is designed to provide the following characteristics.

  • Good exchange type for multi-dimensional machine-learning data
  • Support for different sparse and dense layouts
  • Efficient interop with native Machine Learning Libraries using zero copies
  • Work with any type of memory (Unmanaged, Managed)
  • Allow for slicing and indexing Machine Learning data efficiently
  • Basic math and manipulation
  • Fits existing API patterns in the .NET framework like Vector<T>
  • Support for .NET Framework, .NET Core, Mono, Xamarin & UWP
  • Doesn’t replace existing Math or Machine Learning libraries. Instead the goal is to give those libraries a sufficient exchange type for multi-dimensional data to avoid reimplementing, copying data, or depending on one another.

If you’re familiar with the recent work on Memory<T>, Tensor<T> can be thought of as extending Memory<T> to multiple dimensions and sparse data. The primary attributes defining a Tensor<T> include:

  • Type T: Primitive (e.g. int, float, string…) and non-primitive types
  • Dimensions: The shape of the Tensor (e.g. a 3x5 matrix, a 28x28x3 image, etc.)
  • Layout (Optional): Row vs. Column major. The API calls this “ReverseStride” to generalize to many dimensions instead of just “row” and “column”. The default is reverseStride: false (i.e. row-major).
  • Memory (Optional): The backing storage for the values in the Tensor, which is used in-place without copying.
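
Putting those attributes together, creating a tensor looks roughly like this (a sketch against the prototype API; see the repo below for the exact constructor shapes):

// A 3x5 dense tensor of ints, row-major by default.
var rowMajor = new DenseTensor<int>(new[] { 3, 5 });

// The same shape laid out column-major.
var columnMajor = new DenseTensor<int>(new[] { 3, 5 }, reverseStride: true);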

You can try out Tensor<T> with CNTK for the Pre-built MNIST model by following this GitHub repo.

Types of Tensor

There are currently three tensor types supported as part of this work: DenseTensor, CompressedSparseTensor and SparseTensor. The following summary gives the performance characteristics and recommended usage patterns for each of them.

Dense Tensor

Stores values in a contiguous sequential block of memory where all values are represented.

  • Create: O(n); Memory: O(n); Access: O(1); Insert/Remove: N/A
  • Closest analog: numpy.ndarray
  • Example (C#): var denseTensor = new DenseTensor<int>(new[] { 3, 5 });

Compressed Sparse Tensor

A tensor where only the non-zero values are represented. For a two-dimensional tensor this is referred to as compressed sparse row (CSR, CRS, Yale) or compressed sparse column (CSC, CCS). CompressedSparseTensor is great for interop and serialization but bad for mutation due to reallocation and data shifting.

  • Create: O(capacity); Memory: O(nnz); Access: O(log nonCompressedDimensions); Insert/Remove: O(nnz)
  • Closest analog: sparse.csr_matrix / sparse.csc_matrix in scipy
  • Example (C#): var compressedSparseTensor = new CompressedSparseTensor<int>(new[] { 3, 5 });

Sparse Tensor

A tensor where only the non-zero values are represented. SparseTensor’s backing storage is a Dictionary<int, T> where the key is the linearized index of the n-dimension indices. SparseTensor is meant as an intermediate used to build other tensors, such as CompressedSparseTensor. Unlike CompressedSparseTensor, where insertions are O(nnz), insertions to SparseTensor<T> are nominally O(1).

  • Create: O(capacity); Memory: O(nnz); Access: O(1); Insert/Remove: O(1)
  • Closest analog: sparse.dok_matrix in scipy
  • Example (C#): var sparseTensor = new SparseTensor<int>(new[] { 3, 5 });

One of the additional advantages of using Tensor is that it does the offset (stride) calculation for you, based on the shape and dimensions specified during Tensor allocation, saving you the added complexity of doing the offset calculation yourself. The example below illustrates this in detail.

Tensor Stride Calculation
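
As a sketch of what that looks like: for a 3x5 row-major tensor, the strides are { 5, 1 }, so element [i, j] lives at linear offset i*5 + j*1, and the indexer does that arithmetic for you (illustrative code, matching the DenseTensor example above):

var t = new DenseTensor<int>(new[] { 3, 5 });
t[2, 4] = 42;                       // the tensor computes the offset: 2*5 + 4*1 = 14
// Without Tensor, you'd maintain the stride math by hand:
int linearOffset = 2 * 5 + 4 * 1;   // = 14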

Current Limitations & Non-goals

  • The current arithmetic operations are rudimentary, covering only the bare essentials at this point.
  • There hasn’t been any optimization effort applied. We are considering if these are necessary for a “v1” of Tensor or if they can be trimmed from the initial release.
  • With Tensor<T> we are not looking to build a canonical linear-algebra library or machine learning implementation into .NET.
  • One of the other non-goals with Tensor<T> is to build device-specific memory allocation and arithmetic (e.g. GPU, FPGA) into Tensor. We imagine these scenarios can be implemented in a library outside of the Tensor library.

Future Work

  • Potentially growing the suite of manipulations and arithmetic to be a good, stable, useful set.
  • Improve performance of these methods using vectorization, and specialized CPU instructions like SIMD.
  • Grow the set of implementations of tensor to represent all common layouts
  • Continue working on open issues on our Tensor<T> GitHub repo.

Interested in learning more about how to infuse AI and ML into .NET?

If you are interested in learning more about the many ways you can infuse AI and ML into your .NET apps, you can also watch this high-level overview video that we have put together.

In general, we would love for you to try out Tensor with this getting started sample and provide feedback to help influence the future design of Tensor. If you have comments about how this guide might be more helpful, please leave us a note below or reach out to us directly over email.

Happy coding from the .NET team!

Windows 10 at Microsoft Connect(); 2017


Today, at Microsoft Connect(); we spoke about the work we have been doing to align the concepts and tags that can be shared between Microsoft’s XAML UI systems. With Windows 10 XAML and Xamarin.Forms, we expose the full and unique capabilities of each platform. Xamarin.Forms focuses on native mobile experiences and exposes the common subset of controls and capabilities needed most by mobile developers. Windows 10 XAML and WPF are optimized for native Windows experiences, including the richest and most demanding experiences optimized for use with mouse, keyboard, and touch. We are releasing a preview of Xamarin.Forms that includes additional APIs advancing XAML Standard, including new type names and properties for common elements you will recognize from Windows 10 XAML. I encourage you to check out the APIs at aka.ms/xamlstandard and provide us with your feedback. We look forward to working with you.

Connect(); Sessions

Tomorrow, on November 16th, you can view two Connect(); 2017 live sessions online to learn more about how to create beautiful and engaging applications. In “Engaging with your customers on any platform using the Microsoft Graph, Activity Feed, and Adaptive Cards” you’ll learn how customers’ lives are made more seamless through Microsoft Graph and Project Rome. Allowing customers to use Windows, Android and iOS together ensures continuity in busy lifestyles. Next, you’ll learn about the benefits Bots and Adaptive Cards bring to that same customer, as well as the businesses they are interacting with. And finally, we’ll show you how unique Windows experiences like Notifications and Timeline run smoothly and are quick and easy to integrate with Adaptive Cards. Be sure to tune in at 11 a.m. PST to learn about all of these rich features.

In our next session, also on November 16 at 12pm PST, we show you how to make your applications more beautiful. “Building amazing applications with the Fluent Design System” helps you transform your applications to be engaging, modern and visually rich. It includes the UX building blocks, guidelines, tools and end-to-end support developers need. You will learn how easy it is to use familiar technologies like XAML and C# to create applications that bring the Fluent Design System to life on Windows 10 across a range of devices and inputs. We will cover the different elements of the design system and how to use the latest controls, animations, effects and other platform capabilities to captivate your customers and maximize their productivity. To learn more about the features, visit: http://developer.microsoft.com/design.

Wrapping up

Last month, we shipped the Windows 10 Fall Creators Update SDK – tools and APIs that help you build applications customers will want to engage and re-engage with, driving future growth and retention. We are constantly taking in your feedback and making Windows 10 a better place for all developers.


C++ Core Check improvements in Visual Studio 2017 15.5


This post was written by Sergiy Oryekhov.

In Visual Studio 2017 version 15.5 Preview 4, we have refreshed our C++ Core Guidelines Check extension for the native code static analysis tools. Most of the work since 15.3 has focused on new rules that will help developers starting new projects write safer C++, and help those with legacy projects move toward safer, modern C++. For those who are new to the core guidelines checks, we recommend reviewing the latest version of the working draft: “C++ Core Guidelines“.

This article provides an overview of the changes in VS2017 15.5. For more information about the current release, please see the list of supported rule sets on docs.microsoft.com.

New rule sets

In the previous release we introduced several rule sets to allow customers to narrow code analysis results. The rules implemented for 15.5 extend some of the existing rule sets (Raw Pointer, Owner Pointer, Unique Pointer, Type) and also introduce a few new categories:

  • Class rules: This section includes a few rules mostly focused on proper use of special methods and virtual specifications. This is a subset of checks recommended for classes and class hierarchies.
  • Concurrency rules: This currently includes a single rule which catches badly declared guard objects. For more information see Core Guidelines related to concurrency.
  • Declaration rules: Here are a couple of rules from the interfaces guidelines which focus on how global variables are declared.
  • Function rules: These are two checks that help with adoption of the noexcept specifier. This is a part of the guidelines for clear function design and implementation.
  • Shared pointer rules: As a part of resource management guidelines enforcement, we added a few rules specific to how shared pointers are passed into functions or used locally.
  • Style rules: In this release we have one simple but important check, which bans the use of goto. This is the first step in improving coding style and the use of expressions and statements in C++. While the C++ Core Guidelines allow exceptions for goto, proper use of the construct is rare enough that it deserves review.

New rules in each set

  • Class rules

    • C26432 DEFINE_OR_DELETE_SPECIAL_OPS Special operations (e.g. destructors or copy constructors) imply special behavior and should come in complete sets to define such behavior clearly.
    • C26436 NEED_VIRTUAL_DTOR Having virtual methods suggests polymorphic behavior, which requires more careful management of object cleanup.
    • C26434 DONT_HIDE_METHODS Hiding methods by name is as bad as hiding variables; naming should not lead to ambiguity.
  • Concurrency rules

    • C26441 NO_UNNAMED_GUARDS Guard objects must be assigned to named variables; an unnamed temporary guard (e.g. an unnamed std::lock_guard) is destroyed immediately and provides no protection.
  • Declaration rules

    • C26426 NO_GLOBAL_INIT_CALLS Calling a function from the initializer for a global variable may lead to unexpected results due to undefined order of initialization.
    • C26427 NO_GLOBAL_INIT_EXTERNS Global variables should not refer to external symbols to avoid initialization order problems.
  • Function rules

    • C26439 SPECIAL_NOEXCEPT Some of the special functions (like destructors) should avoid throwing exceptions.
    • C26440 DECLARE_NOEXCEPT If a function neither throws nor calls other functions that can throw, it should be marked as noexcept.
  • Resource management rules

    • C26416 NO_RVALUE_REF_SHARED_PTR Passing shared pointers by rvalue-reference is unnecessary and usually indicates a misuse of shared pointers. Shared pointers are safe and inexpensive to pass by value.
    • C26417 NO_LVALUE_REF_SHARED_PTR A shared pointer passed by reference acts as an output parameter, and it is expected that its ownership will be updated in the function (e.g. by calling reset()). If the shared pointer is only used to access its contained object, a plain reference or pointer to the contained object should be passed instead.
    • C26418 NO_VALUE_OR_CONST_REF_SHARED_PTR When a shared pointer is passed by value or reference to const, it indicates to the caller that the function needs to control the lifetime of its contained object without affecting the calling code. However, if the smart pointer is never copied, moved, or otherwise modified in a way that will affect the contained object’s lifetime, a plain reference or pointer to the contained object should be passed instead.
    • C26415 SMART_PTR_NOT_NEEDED Smart pointers are convenient for resource management, but when they are used only to access the contained object, the code may be simplified by passing plain references or pointers to the contained object instead.
    • C26414 RESET_LOCAL_SMART_PTR Using a local smart pointer implies the function needs to control the lifetime of the contained object. If a function does not use the smart pointer to pass ownership outside of the function and has no explicit calls to change ownership, a stack-allocated local variable should be used instead to avoid an unnecessary heap allocation.
    • C26429 USE_NOTNULL If a pointer is dereferenced but never tested for null, it may be useful to use gsl::not_null so that assumptions about its validity are properly asserted.
    • C26430 TEST_ON_ALL_PATHS If a pointer is dereferenced and is tested in at least one path, the code should ensure it is tested on all paths, since testing implies the possibility that the pointer is null.
    • C26431 DONT_TEST_NOTNULL Testing for nullness of expressions of type gsl::not_null is obviously unnecessary.
  • Style rules

    • C26438 NO_GOTO Modern C++ should never use goto in user-written code (a short example follows this list).
  • Type rules

    • C26437 DONT_SLICE Even though the compiler allows implicit slicing, it is usually unsafe and unmaintainable.
    • C26472 NO_CASTS_FOR_ARITHMETIC_CONVERSION Static casts can silently discard data which doesn’t fit into an arithmetic type.
    • C26473 NO_IDENTITY_CAST Casting between pointers of exactly same type is obviously unnecessary.
    • C26474 NO_IMPLICIT_CAST Casting should be omitted in cases where pointer conversion is done implicitly. Note, the rule ID is a bit misleading: it should be interpreted as “implicit cast is not used where it is acceptable”.
    • C26475 NO_FUNCTION_STYLE_CASTS Function-style cast is another form of a C-style cast and can lead to silent data truncation.
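
To make a couple of these checks concrete, here is a small illustrative snippet of our own (not taken from the rule documentation) with code that would trigger the goto and slicing checks:

struct Base { int id; };
struct Derived : Base { int extra; };

int sum(const int* values, int count) {
    int total = 0;
    for (int i = 0; i < count; ++i) {
        if (values[i] < 0)
            goto done;  // C26438 NO_GOTO: prefer break, continue, or return
        total += values[i];
    }
done:
    return total;
}

void store(Derived d) {
    Base b = d;  // C26437 DONT_SLICE: Derived::extra is silently discarded
    (void)b;     // silence unused-variable warnings in this sketch
}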

Warnings that were rearranged

Some warning numbers found in the VS2017 version 15.3 release are no longer available in VS2017 version 15.5. These warnings did not disappear, but were rather replaced with more specific checks. The primary goal was to separate particularly common patterns within a warning into separate warnings.

In closing

Good tools can help you to maintain and upgrade your code. The C++ Core Guidelines are a great place to start, and the C++ Core Guidelines Checker can help you to clean up your code and keep it clean. Try out the expanded C++ Core Guidelines Checker in Visual Studio 2017 version 15.5 and let us know what you think.

If you have any feedback or suggestions for us, let us know. We can be reached via the comments below, via email (visualcpp@microsoft.com) and you can provide feedback via Help > Report A Problem in the product, or via Developer Community. You can also find us on Twitter (@VisualC) and Facebook (msftvisualcpp).

Good practices for sharing data in spreadsheets


Spreadsheets are powerful tools with many applications: collecting data, sharing data, visualizing data, analyzing data, reporting on data. Sometimes, the temptation to do all of these things in a single workbook is irresistible. But if your goal is to provide data to others for analysis, then features that are useful for, say, reporting are downright detrimental to the task of data analysis. To make things easier on your downstream analysts, and to reduce the risk of inadvertent errors that spreadsheets can cause, Karl Broman and Kara Woo have published a paper, “Data organization in spreadsheets”, chock-full of useful advice. To reiterate from their introduction:

Spreadsheets are often used as a multipurpose tool for data entry, storage, analysis, and visualization. Most spreadsheet programs allow users to perform all of these tasks, however we believe that spreadsheets are best suited to data entry and storage, and that analysis and visualization should happen separately. Analyzing and visualizing data in a separate program, or at least in a separate copy of the data file, reduces the risk of contaminating or destroying the raw data in the spreadsheet.

The paper provides a wealth of helpful tips for making data in spreadsheets ready for analysis:

Basic Data Practices: use consistent data codes and variable names, for example.

Naming Practices: don't use spaces in column names or file names, and make those names meaningful.

Dealing with Dates: avoid a common gotcha by using the ISO 8601 standard for dates (e.g. 2017-11-16), as memorably endorsed by the XKCD comic on the topic.

Representing missing data: don't use empty cells for missing data; use a hyphen or a unique code like NA (but not a number like -9999).

Don't overload the cells. Don't try to pack more than one piece of information into a cell (and don't merge cells, either). In particular, don't encode useful information with color or font; add another column instead.

Follow tidy data principles. Make the data rectangular. If you need multiple rectangles, use multiple files or worksheets. If your data still doesn't fit into that format, a spreadsheet probably wasn't the best place for it in the first place.

Don't make calculations in data files. You're preparing this data for analysis, so avoid the temptation to include formulas to create new data. Just provide the raw data.

You can find details, examples, and lots more useful advice in the complete paper, which I encourage everyone to read at the link below.

The American Statistician: Data organization in spreadsheets (2017), Karl W. Broman and Kara H. Woo (via Sharon Machlis)

Windows 10 SDK Preview Build 17035 now available


Today, we released a new Windows 10 Preview Build of the SDK to be used in conjunction with Windows 10 Insider Preview (Build 17035 or greater). The Preview SDK Build 17035 contains bug fixes and under development changes to the API surface area.

The Preview SDK can be downloaded from the developer section on Windows Insider.

For feedback and updates to the known issues, please see the developer forum. For new developer feature requests, head over to our Windows Platform UserVoice.

Things to note:

  • This build works in conjunction with previously released SDKs and Visual Studio 2017. You can install this SDK and still continue to submit apps that target the Windows 10 Creators Update build or earlier to the Store.
  • The Windows SDK will now formally be supported only by Visual Studio 2017 and greater. You can download Visual Studio 2017 here.

Known Issues:

What’s New:

  • C++/WinRT Now Available:
    The C++/WinRT headers and the cppwinrt compiler (cppwinrt.exe) are now included in the Windows SDK. The compiler comes in handy if you need to consume a third-party WinRT component or author your own WinRT components with C++/WinRT. The easiest way to get working with it after installing the Windows Insider Preview SDK is to start the Visual Studio Developer Command Prompt and run the compiler in that environment. Authoring support is currently experimental and subject to change. Stay tuned, as we will publish more detailed instructions on how to use the compiler in the coming weeks. The ModernCpp blog has a deeper dive into the C++/WinRT compiler. Please give us feedback by creating an issue at https://github.com/microsoft/cppwinrt. A minimal consumption sketch follows.
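
For a sense of what consuming Windows Runtime APIs with C++/WinRT looks like, here is a minimal sketch, assuming the SDK headers are on your include path (compile with /std:c++17 and link windowsapp.lib):

#include <winrt/Windows.Foundation.h>
#include <cstdio>

using namespace winrt;
using namespace Windows::Foundation;

int main()
{
    init_apartment(); // initialize the Windows Runtime for this thread
    Uri uri{ L"https://github.com/microsoft/cppwinrt" };
    wprintf(L"Domain: %s\n", uri.Domain().c_str());
}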

Breaking Changes

New MIDL keywords

As part of the “modernizing IDL” effort, several new keywords have been added to the midlrt tool. These new keywords will cause build breaks if they are encountered in IDL files.

The new keywords are:

  • event
  • set
  • get
  • partial
  • unsealed
  • overridable
  • protected
  • importwinmd

If any of these keywords is used as an identifier, the build will fail with a syntax error.

The error will be similar to:

1 >d:ossrconecorecomcombaseunittestastatestserverstestserver6idlremreleasetest.idl(12) : error MIDL2025 : [msg]syntax error [context]: expecting a declarator or * near “)”

To fix this, add an “@” prefix in front of the offending identifier. The “@” prefix causes MIDL to treat the element as an identifier instead of a keyword.

API Updates and Additions

When targeting new APIs, consider writing your app to be adaptive in order to run correctly on the widest number of Windows 10 devices. Please see Dynamically detecting features with API contracts (10 by 10) for more information.

The following APIs have been added to the platform since the release of 16299.


namespace Windows.ApplicationModel {
  public enum StartupTaskState {
    EnabledByPolicy = 4,
  }
}
namespace Windows.ApplicationModel.Background {
  public sealed class MobileBroadbandPcoDataChangeTrigger : IBackgroundTrigger
}
namespace Windows.ApplicationModel.Calls {
  public enum PhoneCallMedia {
    AudioAndRealTimeText = 2,
  }
  public sealed class VoipCallCoordinator {
    VoipPhoneCall RequestNewAppInitiatedCall(string context, string contactName, string contactNumber, string serviceName, VoipPhoneCallMedia media);
    VoipPhoneCall RequestNewIncomingCall(string context, string contactName, string contactNumber, Uri contactImage, string serviceName, Uri brandingImage, string callDetails, Uri ringtone, VoipPhoneCallMedia media, TimeSpan ringTimeout, string contactRemoteId);
  }
  public sealed class VoipPhoneCall {
    void NotifyCallAccepted(VoipPhoneCallMedia media);
  }
}
namespace Windows.ApplicationModel.Chat {
  public sealed class RcsManagerChangedEventArgs
  public enum RcsManagerChangeType
  public sealed class RcsNotificationManager
}
namespace Windows.ApplicationModel.UserActivities {
  public sealed class UserActivity {
    public UserActivity();
  }
  public sealed class UserActivityChannel {
    public static void DisableAutoSessionCreation();
  }
  public sealed class UserActivityVisualElements {
    string AttributionDisplayText { get; set; }
  }
}
namespace Windows.Devices.PointOfService {
  public sealed class BarcodeScannerReport {
    public BarcodeScannerReport(uint scanDataType, IBuffer scanData, IBuffer scanDataLabel);
  }
  public sealed class ClaimedBarcodeScanner : IClosable {
    void HideVideoPreview();
    IAsyncOperation<bool> ShowVideoPreviewAsync();
  }
  public sealed class UnifiedPosErrorData {
    public UnifiedPosErrorData(string message, UnifiedPosErrorSeverity severity, UnifiedPosErrorReason reason, uint extendedReason);
  }
}
namespace Windows.Globalization {
  public static class ApplicationLanguages {
    public static IVectorView<string> GetLanguagesForUser(User user);
  }
  public sealed class Language {
    LanguageLayoutDirection LayoutDirection { get; }
  }
  public enum LanguageLayoutDirection
}
namespace Windows.Graphics.Imaging {
  public enum BitmapPixelFormat {
    P010 = 104,
  }
}
namespace Windows.Management.Deployment {
  public sealed class PackageManager {
    IAsyncOperationWithProgress<DeploymentResult, DeploymentProgress> RequestAddPackageAsync(Uri packageUri, IIterable<Uri> dependencyPackageUris, DeploymentOptions deploymentOptions, PackageVolume targetVolume, IIterable<string> optionalPackageFamilyNames, IIterable<Uri> relatedPackageUris, IIterable<Uri> packageUrisToInstall);
  }
}
namespace Windows.Media.Audio {
  public sealed class AudioGraph : IClosable {
    IAsyncOperation<CreateMediaSourceAudioInputNodeResult> CreateMediaSourceAudioInputNodeAsync(MediaSource mediaSource);
    IAsyncOperation<CreateMediaSourceAudioInputNodeResult> CreateMediaSourceAudioInputNodeAsync(MediaSource mediaSource, AudioNodeEmitter emitter);
  }
  public sealed class AudioGraphSettings {
    double MaxPlaybackSpeedFactor { get; set; }
  }
  public sealed class CreateMediaSourceAudioInputNodeResult
  public sealed class MediaSourceAudioInputNode : IAudioInputNode, IAudioInputNode2, IAudioNode, IClosable
  public enum MediaSourceAudioInputNodeCreationStatus
}
namespace Windows.Media.Capture {
  public sealed class CapturedFrame : IClosable, IContentTypeProvider, IInputStream, IOutputStream, IRandomAccessStream, IRandomAccessStreamWithContentType {
    BitmapPropertySet BitmapProperties { get; }
    CapturedFrameControlValues ControlValues { get; }
  }
}
namespace Windows.Media.Capture.Frames {
  public sealed class AudioMediaFrame
  public sealed class MediaFrameFormat {
    AudioEncodingProperties AudioEncodingProperties { get; }
  }
  public sealed class MediaFrameReference : IClosable {
    AudioMediaFrame AudioMediaFrame { get; }
  }
  public sealed class MediaFrameSourceController {
    AudioDeviceController AudioDeviceController { get; }
  }
  public enum MediaFrameSourceKind {
    Audio = 4,
  }
}
namespace Windows.Media.Core {
  public sealed class MediaBindingEventArgs {
    void SetDownloadOperation(DownloadOperation downloadOperation);
  }
  public sealed class MediaSource : IClosable, IMediaPlaybackSource {
    DownloadOperation DownloadOperation { get; }
    public static MediaSource CreateFromDownloadOperation(DownloadOperation downloadOperation);
  }
}
namespace Windows.Media.Devices {
  public sealed class VideoDeviceController : IMediaDeviceController {
    VideoTemporalDenoisingControl VideoTemporalDenoisingControl { get; }
  }
  public sealed class VideoTemporalDenoisingControl
  public enum VideoTemporalDenoisingMode
}
namespace Windows.Media.DialProtocol {
  public sealed class DialReceiverApp {
    IAsyncOperation<string> GetUniqueDeviceNameAsync();
  }
}
namespace Windows.Media.MediaProperties {
  public static class MediaEncodingSubtypes {
    public static string P010 { get; }
  }
  public enum MediaPixelFormat {
    P010 = 2,
  }
}
namespace Windows.Media.Playback {
  public sealed class MediaPlaybackSession {
    MediaRotation PlaybackRotation { get; set; }
    MediaPlaybackSessionOutputDegradationPolicyState GetOutputDegradationPolicyState();
  }
  public sealed class MediaPlaybackSessionOutputDegradationPolicyState
  public enum MediaPlaybackSessionVideoConstrictionReason
}
namespace Windows.Media.Streaming.Adaptive {
  public sealed class AdaptiveMediaSourceDiagnosticAvailableEventArgs {
    string ResourceContentType { get; }
    IReference<TimeSpan> ResourceDuration { get; }
  }
  public sealed class AdaptiveMediaSourceDownloadCompletedEventArgs {
    string ResourceContentType { get; }
    IReference<TimeSpan> ResourceDuration { get; }
  }
  public sealed class AdaptiveMediaSourceDownloadFailedEventArgs {
    string ResourceContentType { get; }
    IReference<TimeSpan> ResourceDuration { get; }
  }
  public sealed class AdaptiveMediaSourceDownloadRequestedEventArgs {
    string ResourceContentType { get; }
    IReference<TimeSpan> ResourceDuration { get; }
  }
}
namespace Windows.Networking.BackgroundTransfer {
  public sealed class DownloadOperation : IBackgroundTransferOperation, IBackgroundTransferOperationPriority {
    void MakeCurrentInTransferGroup();
  }
  public sealed class UploadOperation : IBackgroundTransferOperation, IBackgroundTransferOperationPriority {
    void MakeCurrentInTransferGroup();
  }
}
namespace Windows.Networking.Connectivity {
  public sealed class CellularApnContext {
    string ProfileName { get; set; }
  }
  public sealed class ConnectionProfileFilter {
    IReference<Guid> PurposeGuid { get; set; }
  }
  public sealed class WwanConnectionProfileDetails {
    WwanNetworkIPKind IPKind { get; }
    IVectorView<Guid> PurposeGuids { get; }
  }
  public enum WwanNetworkIPKind
}
namespace Windows.Networking.NetworkOperators {
  public sealed class MobileBroadbandAntennaSar {
    public MobileBroadbandAntennaSar(int antennaIndex, int sarBackoffIndex);
  }
  public sealed class MobileBroadbandModem {
    IAsyncOperation<MobileBroadbandPco> TryGetPcoAsync();
  }
  public sealed class MobileBroadbandModemIsolation
  public sealed class MobileBroadbandPco
  public sealed class MobileBroadbandPcoDataChangeTriggerDetails
}
namespace Windows.Networking.Sockets {
  public sealed class ServerMessageWebSocket : IClosable
  public sealed class ServerMessageWebSocketControl
  public sealed class ServerMessageWebSocketInformation
  public sealed class ServerStreamWebSocket : IClosable
  public sealed class ServerStreamWebSocketInformation
}
namespace Windows.Networking.Vpn {
  public sealed class VpnNativeProfile : IVpnProfile {
    string IDi { get; set; }
    VpnPayloadIdType IdiType { get; set; }
    string IDr { get; set; }
    VpnPayloadIdType IdrType { get; set; }
    bool IsImsConfig { get; set; }
    string PCscf { get; }
  }
  public enum VpnPayloadIdType
}
namespace Windows.Security.Authentication.Identity.Provider {
  public enum SecondaryAuthenticationFactorAuthenticationMessage {
    CanceledByUser = 22,
    CenterHand = 23,
    ConnectionRequired = 20,
    DeviceUnavaliable = 28,
    MoveHandCloser = 24,
    MoveHandFarther = 25,
    PlaceHandAbove = 26,
    RecognitionFailed = 27,
    TimeLimitExceeded = 21,
  }
}
namespace Windows.Services.Maps {
  public sealed class MapRouteDrivingOptions {
    IReference<DateTime> DepartureTime { get; set; }
  }
}
namespace Windows.System {
  public sealed class AppActivationResult
  public sealed class AppDiagnosticInfo {
    IAsyncOperation<AppActivationResult> ActivateAsync();
  }
  public sealed class AppResourceGroupInfo {
    IAsyncOperation<bool> TryResumeAsync();
    IAsyncOperation<bool> TrySuspendAsync();
    IAsyncOperation<bool> TryTerminateAsync();
  }
}
namespace Windows.System.Diagnostics.DevicePortal {
  public sealed class DevicePortalConnection {
    ServerMessageWebSocket GetServerMessageWebSocketForRequest(HttpRequestMessage request);
    ServerMessageWebSocket GetServerMessageWebSocketForRequest(HttpRequestMessage request, SocketMessageType messageType, string protocol);
    ServerMessageWebSocket GetServerMessageWebSocketForRequest(HttpRequestMessage request, SocketMessageType messageType, string protocol, uint outboundBufferSizeInBytes, uint maxMessageSize, MessageWebSocketReceiveMode receiveMode);
    ServerStreamWebSocket GetServerStreamWebSocketForRequest(HttpRequestMessage request);
    ServerStreamWebSocket GetServerStreamWebSocketForRequest(HttpRequestMessage request, string protocol, uint outboundBufferSizeInBytes, bool noDelay);
  }
  public sealed class DevicePortalConnectionRequestReceivedEventArgs {
    bool IsWebSocketUpgradeRequest { get; }
    IVectorView<string> WebSocketProtocolsRequested { get; }
    Deferral GetDeferral();
  }
}
namespace Windows.System.RemoteSystems {
  public static class KnownRemoteSystemCapabilities {
    public static string NearShare { get; }
  }
}
namespace Windows.System.UserProfile {
  public static class GlobalizationPreferences {
    public static GlobalizationPreferencesForUser GetForUser(User user);
  }
  public sealed class GlobalizationPreferencesForUser
}
namespace Windows.UI.Composition {
  public class CompositionLight : CompositionObject {
    bool IsEnabled { get; set; }
  }
  public sealed class Compositor : IClosable {
    string Comment { get; set; }
  }
  public sealed class PointLight : CompositionLight {
    Vector2 AttenuationCutoff { get; set; }
  }
  public sealed class SpotLight : CompositionLight {
    Vector2 AttenuationCutoff { get; set; }
  }
}
namespace Windows.UI.Composition.Core {
  public sealed class CompositorController : IClosable
}
namespace Windows.UI.Xaml {
  public sealed class BringIntoViewOptions {
    IReference<double> HorizontalAlignmentRatio { get; set; }
    Point Offset { get; set; }
    IReference<double> VerticalAlignmentRatio { get; set; }
  }
  public sealed class BringIntoViewRequestedEventArgs : RoutedEventArgs
  public sealed class EffectiveViewportChangedEventArgs
  public class FrameworkElement : UIElement {
    event TypedEventHandler<FrameworkElement, EffectiveViewportChangedEventArgs> EffectiveViewportChanged;
    void InvalidateViewport();
    virtual bool IsViewport();
  }
  public class UIElement : DependencyObject {
    public static RoutedEvent BringIntoViewRequestedEvent { get; }
    KeyboardAcceleratorPlacementMode KeyboardAcceleratorPlacementMode { get; set; }
    public static DependencyProperty KeyboardAcceleratorPlacementModeProperty { get; }
    DependencyObject KeyboardAcceleratorToolTipTarget { get; set; }
    public static DependencyProperty KeyboardAcceleratorToolTipTargetProperty { get; }
    DependencyObject KeyTipTarget { get; set; }
    public static DependencyProperty KeyTipTargetProperty { get; }
    event TypedEventHandler<UIElement, BringIntoViewRequestedEventArgs> BringIntoViewRequested;
    virtual void OnBringIntoViewRequested(BringIntoViewRequestedEventArgs e);
    virtual void OnKeyboardAcceleratorInvoked(KeyboardAcceleratorInvokedEventArgs args);
  }
}
namespace Windows.UI.Xaml.Automation.Peers {
  public sealed class AutoSuggestBoxAutomationPeer : FrameworkElementAutomationPeer, IInvokeProvider {
    void Invoke();
  }
}
namespace Windows.UI.Xaml.Controls {
  public class AppBarButton : Button, ICommandBarElement, ICommandBarElement2 {
    string KeyboardAcceleratorText { get; set; }
    public static DependencyProperty KeyboardAcceleratorTextProperty { get; }
    AppBarButtonTemplateSettings TemplateSettings { get; }
  }
  public class AppBarToggleButton : ToggleButton, ICommandBarElement, ICommandBarElement2 {
    string KeyboardAcceleratorText { get; set; }
    public static DependencyProperty KeyboardAcceleratorTextProperty { get; }
    AppBarToggleButtonTemplateSettings TemplateSettings { get; }
  }
  public class MenuFlyoutItem : MenuFlyoutItemBase {
    string KeyboardAcceleratorText { get; set; }
    public static DependencyProperty KeyboardAcceleratorTextProperty { get; }
    MenuFlyoutItemTemplateSettings TemplateSettings { get; }
  }
  public class NavigationView : ContentControl {
    string PaneTitle { get; set; }
    public static DependencyProperty PaneTitleProperty { get; }
    event TypedEventHandler<NavigationView, object> PaneClosed;
    event TypedEventHandler<NavigationView, NavigationViewPaneClosingEventArgs> PaneClosing;
    event TypedEventHandler<NavigationView, object> PaneOpened;
    event TypedEventHandler<NavigationView, object> PaneOpening;
  }
  public sealed class NavigationViewPaneClosingEventArgs
  public enum WebViewPermissionType {
    Screen = 5,
  }
}
namespace Windows.UI.Xaml.Controls.Maps {
  public sealed class MapControl : Control {
    string Region { get; set; }
    public static DependencyProperty RegionProperty { get; }
  }
  public class MapElement : DependencyObject {
    bool IsEnabled { get; set; }
    public static DependencyProperty IsEnabledProperty { get; }
  }
}
namespace Windows.UI.Xaml.Controls.Primitives {
  public sealed class AppBarButtonTemplateSettings : DependencyObject
  public sealed class AppBarToggleButtonTemplateSettings : DependencyObject
  public sealed class MenuFlyoutItemTemplateSettings : DependencyObject
}
namespace Windows.UI.Xaml.Input {
  public sealed class KeyboardAcceleratorInvokedEventArgs {
    KeyboardAccelerator KeyboardAccelerator { get; }
  }
  public enum KeyboardAcceleratorPlacementMode
}


MSVC conformance improvements in Visual Studio 2017 version 15.5


The MSVC toolset included in Visual Studio version 15.5 Preview 4 includes many C++ conformance improvements. Throughout the VS2015 and VS2017 releases we’ve focused on conformance with C++ standards, including C++17 features. With VS2017 version 15.5, MSVC has implemented about 75% of the C++17 core language and library features. These features can be enabled with the /std:c++17 compiler switch.

  • Notable new features in the compiler include (a short sample follows these lists):
    • Structured bindings
    • constexpr lambdas
    • if constexpr
    • Inline variables
    • Fold expressions
    • Addition of noexcept to the type system
  • Notable changes to our implementation of the Standard Library include:
    • not_fn()
    • Rewording enable_shared_from_this
    • Splicing Maps and Sets
    • Removing Allocator Support in std::function
    • shared_ptr<T[]>, shared_ptr<T[N]>
    • Inline Variables for the STL
    • Removal of Dynamic Exception Specifications
    • Deprecating shared_ptr::unique()
    • Deprecating <codecvt>
    • Deprecating Vestigial Library Parts
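
To make a few of the compiler features above concrete, here is a small stand-alone sample of ours (compile with cl /std:c++17) showing inline variables, fold expressions, constexpr if, and structured bindings:

#include <iostream>
#include <map>
#include <type_traits>

inline int counter = 0;  // inline variable: one definition across translation units

template <typename... Ts>
auto sum(Ts... ts) { return (ts + ... + 0); }  // unary right fold expression

template <typename T>
auto describe(T value) {
    if constexpr (std::is_integral_v<T>)  // the discarded branch is not instantiated
        return value * 2;
    else
        return value;
}

int main() {
    ++counter;
    std::map<int, const char*> m{ {1, "one"}, {2, "two"} };
    for (const auto& [key, name] : m)  // structured bindings
        std::cout << key << " = " << name << '\n';
    std::cout << sum(1, 2, 3) << ' ' << describe(21) << '\n';
}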

We’ve also made significant progress on fixing older conformance issues including expression SFINAE fixes, constexpr completeness, and the majority of two-phase name lookup cases.

Lastly, with regard to current Technical Specifications, we’ve made syntax changes as requested by the C++ Standards Committee. The MSVC compiler in VS2017 version 15.5 implements the syntax change to module interfaces that was requested at the summer meeting.

You now add export in the declaration of a module interface. For example, you write this

      export module FileIO;
      export File OpenFile(const Path&);

to declare the module interface of FileIO where you would have previously written

      module FileIO;
      export File OpenFile(const Path&);

More complete information about C++ conformance improvements in MSVC can be found on docs.microsoft.com and in an upcoming blog post from Stephan T. Lavavej.

Conformance mode on by default for new projects

We’ve enabled the /permissive- conformance mode by default for new projects created in Visual C++, enabling you to write code that is much closer to C++ standards conformance. This mode disables non-conforming C++ constructs that have existed in MSVC for years. You can learn more about conformance mode and the /permissive- switch that controls it in this blog post or on docs.microsoft.com; a minimal example of the kind of code the mode rejects follows.
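
As a minimal illustration (our example, not from the post), binding a temporary to a non-const reference is a long-standing MSVC extension that conformance mode turns into a compiler error:

struct Widget {};

void update(Widget&) {}

int main()
{
    update(Widget{}); // accepted as a legacy MSVC extension; an error under /permissive-
}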

To enable Conformance mode in an older project (or disable it in a new project), just change the Project Properties > C/C++ > Language > Conformance mode setting:

If you’re building on a machine without Visual Studio installed and need to edit the .vcxproj directly, you’ll find the setting is controlled by the ConformanceMode tag. Remember that this tag exists once for each platform configuration in your project. Here’s an example from a .vcxproj file:

  <ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
    <ClCompile>
      <WarningLevel>Level3</WarningLevel>
      <Optimization>Disabled</Optimization>
      <SDLCheck>true</SDLCheck>
      <ConformanceMode>true</ConformanceMode>
    </ClCompile>
  </ItemDefinitionGroup>

The conformance mode is compatible with almost all header files from the latest Windows Kits, starting with the Windows Fall Creators SDK (10.0.16299.0). Individual conformance features can be controlled by the finer-grained /Zc conformance switches.

MSVC toolset version number increases to 14.12

Because of the number of conformance improvements and bug fixes included in the MSVC toolset that ships with VS2017 version 15.5, we’re increasing the version number from 14.11 to 14.12. This minor version bump indicates that the VS2017 MSVC toolset is binary compatible with the VS2015 MSVC toolset, enabling an easier upgrade for VS2015 users.

VS2017 version 15.5 includes the third significant update to the MSVC toolset in VS2017. The first update released with VS2017 RTW; the second came with version 15.3. For reference, here are the MSVC toolset versions and compiler versions (_MSC_VER) in each release from VS2015 to VS2017; a small compile-time detection sketch follows the table. (Note that for historical reasons the MSVC compiler version is 5 higher than the MSVC toolset version displayed in Visual Studio.)

Visual Studio version | MSVC toolset version | MSVC compiler version (_MSC_VER)
VS2015 and Updates 1, 2 & 3 | v140 in VS; version 14.00 | 1900
VS2017, versions 15.1 & 15.2 | v141 in VS; version 14.10 | 1910
VS2017, versions 15.3 & 15.4 | v141 in VS; version 14.11 | 1911
VS2017, version 15.5 | v141 in VS; version 14.12 | 1912
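
If you need to branch on toolset versions in code, here is a small sketch of compile-time detection using the values in the table above:

#include <cstdio>

int main()
{
#if defined(_MSC_VER) && _MSC_VER >= 1912
    std::printf("Built with the VS2017 version 15.5 toolset (14.12) or later\n");
#else
    std::printf("Built with an older MSVC toolset or another compiler\n");
#endif
}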

In closing

Try out the MSVC compiler in Visual Studio version 15.5 Preview 4 and let us know what you think! As always, we can be reached via the comments below, via email (visualcpp@microsoft.com) and you can provide feedback via Help > Report A Problem in the product, or via Developer Community. You can also find us on Twitter (@VisualC) and Facebook (msftvisualcpp).

Release Gates: Releases with continuous monitoring built in

Continuous monitoring is an integral part of safe deployments and DevOps pipelines. Ensuring the app in a release is healthy after deployment is as critical as the success of the deployment process. Enterprises adopt various tools for automatic detection of app health in production, and for keeping track of customer reported incidents. Manual approvals are frequently used... Read More

Azure Networking updates for Fall 2017


At September’s Ignite 2017, our announcements focused on the fundamental pillars of security, performance, monitoring, connectivity, and availability. At Ignite we learned from thousands of customers building sophisticated virtual networks to support their mission-critical applications. We’ve enhanced our services, putting a special emphasis on simplifying their management, and we continue to expand the regions where the new offerings are available. Here is a short summary, with more details below.

  • VNet Service Endpoints preview in all public regions
  • DDoS protection now in US, Europe, and Asia regions
  • Service Tags Preview in all public regions
  • Application Security Groups Augmented Rules Preview in all public regions
  • Data Plane Development Kit (DPDK) now in Preview
  • Network Performance Monitor for ExpressRoute Preview
  • Azure Monitor and Resource Health for ExpressRoute GA
  • Traffic View visualizations in Portal
  • Point-to-Site VPN for Macs and AD Authentication GA
  • Azure DNS CAA Records and IPv6 support GA

Security

VNet Service Endpoints preview now available in all public regions

Virtual Network Service Endpoints extend your virtual network's private address space and the identity of your VNet to Azure services, securing services such as Storage and SQL Database that have Internet-facing IP addresses. Service Endpoints for Azure Storage and Azure SQL Database are now available in preview in all regions of the Azure public cloud. We will be adding more Azure services to VNet Service Endpoints in the coming months. For more information see VNet Service Endpoints.


VNet Service Endpoints restrict access to Azure services so they can be reached only from a VNet

DDoS Protection Preview now in US, Europe, and Asia

The new Azure DDoS Protection service helps protect your applications from targeted DDoS attacks and provides additional configuration, alerting, and telemetry. Continuous, automatic tuning helps protect your publicly accessible resources in a VNet: the service profiles your application’s normal traffic patterns using sophisticated machine learning algorithms and intelligently detects malicious traffic, mitigating targeted DDoS attacks. Azure DDoS Protection is now available for preview in select regions in the US, Europe, and Asia. For details see DDoS Protection.

Azure DDoS Protection helps protect publicly accessible resources in a VNet

Simplifying Networking Security Management

Network Security Groups (NSGs) allow you to define network security access policies based on IP addresses controlling access to and from VMs and subnets in your VNet. Managing complex security policies using only IP addresses can be cumbersome and error-prone. We simplified the management of NSGs with Service Tags, Application Security Groups and enhanced NSG rule capabilities.


Simpler Network Security Group management with tags, groups, and enhanced rules

Service Tags Preview now in all public cloud regions

Previously a VNet requiring access to services such as Storage had to allow access to ranges of Azure public IP addresses. Maintaining these IP address ranges was problematic. A service tag simplifies this management task by using a symbolic name to represent all the IP addresses for a given Azure service, either globally or regionally. For example, the service tag named Storage represents all the Azure Storage IP addresses. You can use service tags in NSG rules to allow or deny traffic to a specific Azure service by name. The underlying IP addresses for the tag are automatically updated. Service Tags for Storage, SQL, and Traffic Manager are available in preview in all regions in Azure public cloud with more services coming soon. For more information see Service Tags.

Network Security Group Augmented Rules GA in all regions

Augmented rules for Network Security Groups simplify security definitions. You can define larger, more complex network security policies with fewer rules that are more easily maintained. Multiple ports, multiple explicit IP addresses, service tags, and application security groups can all be combined into a single, easily understood security rule. For example, multiple internet clients can access your website through a single NSG rule rather than one NSG rule per client. NSG augmented rules are now generally available in all regions of the Azure public cloud. For more details see NSG Augmented Rules.

Performance - Azure remains the fastest public cloud

Azure Accelerated Networking

In Azure Networking, we are continually pushing to provide the best performance for our customers. At Ignite, we announced the fastest VMs offered in the public cloud, with 30 Gbps of VM-to-VM throughput. Customers can now easily enable this feature on several of our VM instance sizes and distros. For more details see Azure Accelerated Networking.

Data Plane Development Kit preview availability

At Ignite, we announced support for the Data Plane Development Kit (DPDK). DPDK enables higher performance for network-intensive applications, such as network appliances, that demand the most efficient and highest-rate packet processing available. We are announcing the public preview of DPDK on all Accelerated Networking VMs in Canada East, available now for customers to evaluate. To learn more about DPDK, please contact us at AzureDPDK@microsoft.com.

Monitoring

Customers entrust Azure to run their mission critical workloads. Providing deep operational insights into the real-time behavior of these production applications is essential. We continue to enhance our network monitoring capabilities to address customers’ needs.

Network Performance Monitor for ExpressRoute now preview

Network Performance Monitor (NPM) is an end-to-end monitoring solution for ExpressRoute, from on-premises to Microsoft. Customers can monitor latency, packet loss, and bandwidth between on-premises networks and their VNets, set threshold-based alerts on these metrics, and visualize their ExpressRoute topology. With service endpoint monitoring, customers can track the reachability of Office 365 and PaaS services such as Azure Storage. The Network Performance Monitor for ExpressRoute preview is now available in the East US, Southeast Asia, and West Europe regions. For more details see Network Performance Monitor for ExpressRoute.


ExpressRoute Network Performance Monitoring over multiple paths

Azure Monitor and Resource Health for ExpressRoute Circuits GA

We are announcing the General Availability of Azure Monitor for ExpressRoute. Customers can see ExpressRoute Circuit total throughput in Azure Monitor on the portal.

We are also announcing Resource Health for ExpressRoute Circuits. Customers can check for known issues with their ExpressRoute Circuits and see any relevant troubleshooting steps before submitting a support ticket.


Throughput per ExpressRoute circuit with Azure Monitor

Visualize your traffic patterns using Traffic View

Azure Traffic Manager's Traffic View provides customers with actionable intelligence about their users and assists with selecting Azure regions for better performance and reduced latency. We are adding powerful visualizations of the Traffic View data: a heatmap of users' experience that allows customers to understand traffic patterns at a global level and zoom in on specific geographies. For more information please see the Traffic View Overview.


Traffic pattern for an endpoint

Connectivity

Point-to-Site VPN for Macs and AD Authentication now GA

We are announcing the general availability of Point-to-Site (P2S) VPN for macOS as well as Active Directory (AD) Domain authentication for P2S VPN. Customers can connect to Azure Virtual Networks over P2S VPN from their macOS devices using the native IKEv2 VPN client. SSTP continues to be the P2S solution for Windows. Customers can support a mixed client environment consisting of both Windows and macOS by enabling both IKEv2 and SSTP VPN.


P2S VPN connects Windows and macOS machines to Azure VNets

To simplify authentication, customers can use their organization's domain credentials for IKEv2 and SSTP VPN authentication by enabling RADIUS authentication. The Azure VPN Gateway integrates with the customer's RADIUS and AD Domain deployment in Azure or in their on-premises datacenter. RADIUS servers integrate with other identity providers, offering multiple authentication options (including multi-factor) for P2S VPN.


AD Domain Authentication for P2S VPN

Azure provides customers the VPN client configuration required for the native VPN clients on Windows and macOS to connect to Azure. Both P2S VPN for Macs and AD Domain Authentication are available on all Gateway SKUs except the Basic SKU. To learn more about Azure P2S VPN visit our Point-to-Site VPN page.

Azure DNS CAA Record Support and IPv6 Nameservers now GA

We are pleased to announce general availability of support for IPv6 DNS queries and for Certificate Authority Authorization (CAA) records. The Certification Authority Authorization (CAA) resource record allows domain owners to specify one or more Certification Authorities (CAs) that are authorized to issue certificates for their domains. Checking for CAA records as part of the certificate issuance process is now mandatory for CAs. This defense-in-depth security feature allows CAs to reduce the risk of unintended certificate mis-issuance. To learn more, see this post with more details about Azure DNS support for CAA records and IPv6.

Summary

We will continue working to simplify overall network management and to improve the security, scalability, availability, and performance of your mission-critical applications. With Azure Networking, you can fully reap the benefits offered by our cloud and our global backbone network. There will be more updates and announcements in the coming months. This is an exciting time for all of us as the cloud continues to evolve. We welcome your continued feedback on the features we released at Ignite, as well as on these new features and capabilities, to ensure our roadmap meets your requirements.

Azure DNS Updates – CAA Record Support and IPv6 Nameservers


We are pleased to announce a couple of updates to Azure DNS that have been long awaited by our customers:

  • Support for Certificate Authority Authorization (CAA) Records
  • IPv6 Nameservers

Support for CAA Records

The Certification Authority Authorization (CAA) resource record allows domain owners to specify one or more Certification Authorities (CAs) that are authorized to issue certificates for their domains. Checking for CAA records as part of the certificate issuance process is now mandatory for CAs. This defense-in-depth security feature allows CAs to reduce the risk of unintended certificate mis-issuance.

The CAA record type supports three properties:

  • Flags: An unsigned integer between 0 and 255, used to represent the critical flag that has a specific meaning per the RFC
  • Tag: An ASCII string, which can be one of the following:
    • Issue: allows domain owners to specify CAs that are permitted to issue certificates (all types)
    • Issuewild: allows domain owners to specify CAs that are permitted to issue certificates (wildcard certificates only)
    • Iodef: allows domain owners to specify an email address or hostname to which CAs can report certificate issuance requests for domains not otherwise authorized via CAA records
  • Value: the value associated with the specific tag used; for the Issue and Issuewild tags this is typically the domain name of the authorized CA

You can create and manage CAA records via the Azure REST API, PowerShell and CLI.

IPv6 Nameservers

Over the past few years there has been significant growth not only in DNS clients supporting IPv6, but also with recursive resolvers supporting domain name querying over IPv6. We are pleased to announce that Azure DNS nameservers now support queries over IPv6, in addition to IPv4 as before.

As an example, the below Portal screen capture illustrates how you can find the Azure DNS name servers for your DNS zone hosted in Azure DNS.

[Screenshot: name servers for a zone hosted in Azure DNS]

You can find the IPv6 address for each of those name servers using a command such as nslookup.

[Screenshot: nslookup output showing the name servers' IPv6 addresses]

As always, we love getting our customers’ feedback. We have an Azure Feedback channel to receive suggestions for features and future supported scenarios. We encourage you to browse what others are suggesting, vote for your favorites, and enter suggestions of your own.

Please review our public FAQ for answers to the most frequent questions.

Announcing General Availability of Azure Reserved VM Instances (RIs)


Azure Reserved VM Instances (RIs) are generally available for customers worldwide, effective today. Azure RIs enable you to reserve Virtual Machines on a one- or three-year term, and provide up to 72% cost savings versus pay-as-you-go prices.

Azure RIs give you price predictability and help improve your budgeting and forecasting. Azure RIs also provide unprecedented flexibility should your business needs change. We’ve made it easy to exchange your RIs and make changes such as region or VM family, and unlike other cloud providers, you can cancel Azure RIs at any time and get a refund.


Azure is the most cost-effective cloud for Windows Server workloads

If you are a Windows Server customer with Software Assurance, you can combine Azure RIs with the Azure Hybrid Benefit and save up to 82% compared to pay-as-you-go prices, and up to 67%* compared to AWS RIs for Windows VMs. In addition, with the Azure Hybrid Benefit for SQL Server, customers with Software Assurance will be able to save even more.

With Azure RIs and Azure Hybrid Benefit, Azure is the most cost-effective public cloud to run your Windows Server workloads!

Azure offers better than per-second billing and free cost management tools

Azure meters VM usage by the second and rounds your bill down to the last full minute, saving you money and simplifying your bill. For example, a VM that runs for 345 seconds is billed for 300 seconds. And, Azure offers you built-in, easy-to-use controls to schedule VM auto-shutdown, so you can save money by shutting down VMs when you don’t need them – Dev/Test environments, big data analysis, and other batch operations, for example.

Additionally, with free Azure Cost Management, you can further optimize your cloud resources, manage departmental budgets, and allocate costs.  Today, you can already right-size virtual machines based on real-time usage reports. You’ll soon be able to visualize the cost benefits of purchasing RIs compared to pay-as-you-go and understand the utilization of your existing RIs and other instances over time, as shown in the screenshot below:

[Screenshot: RI cost and utilization views in Azure Cost Management]

The combination of these capabilities, our price-match commitment, and product innovations like Burstable VMs and Archival Storage, means that Azure offers the best deal for moving to and innovating in the cloud.

Learn more at azure.microsoft.com/pricing.

*Based on comparing 3-year Azure D8_v3 RI prices with Azure Hybrid Benefit in US West 2 to AWS m4.2xlarge 3-year Standard RIs in US West (Oregon). Actual savings may vary based on region, instance type, usage or software license costs.

Cloud-hosted Mac agents for CI/CD pipelines

Removing barriers to DevOps in the cloud Teams developing software for Apple devices have limited options when migrating to the cloud. Because such apps must be built on Macs, and because there are few cloud-hosted Mac offerings, many teams are forced to provide their own Mac hardware for CI/CD while the rest of their DevOps... Read More

Preview the new Azure Storage SDK for Go & Storage SDKs roadmap


We are excited to announce the new and redesigned Azure Storage SDK for Go with documentation and examples available today. This new SDK was redesigned to follow the next generation design philosophy for Azure Storage SDKs and is built on top of code generated by AutoRest, an open source code generator for the OpenAPI specification.

Azure Storage Client Libraries have evolved dramatically in the past few years, supporting many development languages from C++ to JavaScript. As we strive to support all platforms and programming languages, we have decided to use AutoRest to accelerate delivering new features in more languages. In the Storage SDK for Go, AutoRest produces what we call the protocol layer. The new SDK uses this internally to communicate with the Storage service.

Currently, the new Azure Storage SDK for Go supports only Blob storage, but we will be releasing Files and Queue support in the future. Note that all these services will be packaged separately, something we started doing recently with the Azure Storage SDK for Python. Expect to see split packages for all the Storage Client Libraries in the future. This reduces the library footprint dramatically if you only use one of the Storage services.

What's new?

  • Layered SDK architecture: Low-level and high-level APIs
  • There is one SDK package per service (REST API) version for greater flexibility. You can choose to use an older REST API version and avoid breaking your application. In addition, you can load multiple packages side by side.
  • Built on code generated by AutoRest
  • Split packages for Blob, Files and Queue services for reduced footprint
  • A new extensible middleware pipeline to mutate HTTP requests and responses
  • Progress notification for upload and download operations

Layered SDK architecture

The new Storage SDK for Go consists of three layers that simplify the programming experience and improve debuggability. The first layer is the auto-generated layer, consisting of private classes and functions. You will be able to generate this layer yourself using AutoRest in the future. Stay tuned!

The second layer is a stateless, thin wrapper that maps one-to-one to the Azure Storage REST API operations. As an example, a BlobURL object offers methods like PutBlob, PutBlock, and PutBlockList. Calling one of these APIs results in a single REST API request, plus retries if the first call fails.

The third layer consists of high-level abstractions provided for your convenience. One example is UploadBlockBlobFromStream, which calls a number of PutBlock operations depending on the size of the stream being uploaded.

Hello World

Following is a Hello World example using the new Storage SDK for Go. Check out the full example on GitHub.

// From the Azure portal, get your Storage account's name and key and set environment variables.
accountName, accountKey := os.Getenv("ACCOUNT_NAME"), os.Getenv("ACCOUNT_KEY")

// Use your Storage account's name and key to create a credential object.
credential := azblob.NewSharedKeyCredential(accountName, accountKey)

// Create a request pipeline that is used to process HTTP(S) requests and responses.
p := azblob.NewPipeline(credential, azblob.PipelineOptions{})

// Create a ServiceURL object that wraps the service URL and a request pipeline.
u, _ := url.Parse(fmt.Sprintf("https://%s.blob.core.windows.net", accountName))
serviceURL := azblob.NewServiceURL(*u, p)

// All HTTP operations allow you to specify a Go context.Context object to
// control cancellation/timeout.
ctx := context.Background() // This example uses a never-expiring context.

// Let's create a container
fmt.Println("Creating a container named 'mycontainer'")
containerURL := serviceURL.NewContainerURL("mycontainer")
_, err := containerURL.Create(ctx, azblob.Metadata{}, azblob.PublicAccessNone)
if err != nil { // An error occurred
   if serr, ok := err.(azblob.StorageError); ok { // This error is a service-specific error
      switch serr.ServiceCode() { // Compare serviceCode to ServiceCodeXxx constants
         case azblob.ServiceCodeContainerAlreadyExists:
            fmt.Println("Received 409. Container already exists")
            break
         default:
            // Handle other errors ...
            log.Fatal(err)
      }
   }
}

// Create a URL that references a to-be-created blob in your
// Azure Storage account's container.
blobURL := containerURL.NewBlockBlobURL("HelloWorld.txt")

// Create the blob with string (plain text) content.
data := "Hello World!"
putResponse, err := blobURL.PutBlob(ctx, strings.NewReader(data),
   azblob.BlobHTTPHeaders{ContentType: "text/plain"}, azblob.Metadata{},
   azblob.BlobAccessConditions{})
if err != nil {
   log.Fatal(err)
}
fmt.Println("Etag is " + putResponse.ETag())

Progress reporting with the new Pipeline package

One of the most frequently requested features across all Storage Client Libraries has been the ability to track transfer progress in bytes. This is now available in the Storage SDK for Go. Here is an example:

// From the Azure portal, get your Storage account's name and key and set environment variables.
accountName, accountKey := os.Getenv("ACCOUNT_NAME"), os.Getenv("ACCOUNT_KEY")

// Create a request pipeline using your Storage account's name and account key.
credential := azblob.NewSharedKeyCredential(accountName, accountKey)
p := azblob.NewPipeline(credential, azblob.PipelineOptions{})

// From the Azure portal, get your Storage account blob service URL endpoint.
cURL, _ := url.Parse(
   fmt.Sprintf("https://%s.blob.core.windows.net/mycontainer", accountName))

// Create a ContainerURL object that wraps the container URL and a request
// pipeline to make requests.
containerURL := azblob.NewContainerURL(*cURL, p)

ctx := context.Background() // This example uses a never-expiring context
// Here's how to create a blob with HTTP headers and metadata (I'm using
// the same metadata that was put on the container):
blobURL := containerURL.NewBlockBlobURL("Data.bin")

// requestBody is the stream of data to write
requestBody := strings.NewReader("Some text to write")

// Wrap the request body in a RequestBodyProgress and pass a callback function
// for progress reporting.
_, err := blobURL.PutBlob(ctx,
   pipeline.NewRequestBodyProgress(requestBody,
      func(bytesTransferred int64) {
         fmt.Printf("Wrote %d of %d bytes.\n", bytesTransferred, requestBody.Len())
      }),
   azblob.BlobHTTPHeaders{
      ContentType:        "text/html; charset=utf-8",
      ContentDisposition: "attachment",
   }, azblob.Metadata{}, azblob.BlobAccessConditions{})
if err != nil {
    log.Fatal(err)
}

// Here's how to read the blob's data with progress reporting:
get, err := blobURL.GetBlob(ctx, azblob.BlobRange{}, azblob.BlobAccessConditions{}, false)
if err != nil {
    log.Fatal(err)
}

// Wrap the response body in a ResponseBodyProgress and pass a callback function
// for progress reporting.
responseBody := pipeline.NewResponseBodyProgress(get.Body(),
   func(bytesTransferred int64) {
      fmt.Printf("Read %d of %d bytes.\n", bytesTransferred, get.ContentLength())
   })
downloadedData := &bytes.Buffer{}
downloadedData.ReadFrom(responseBody)
// The downloaded blob data is in downloadedData's buffer

What's next?

Storage SDK for Go roadmap:

  • General availability
  • Files and Queue packages
  • New Storage features like archive and blob tiers coming soon
  • Convenience features like parallel file transfer for higher throughput

Roadmap for the rest of the Storage SDKs:

  • Split packages for Blob, Files, and Queue services coming soon for .NET and Java, followed by all other Storage Client Libraries
  • OpenAPI (a.k.a. Swagger) specifications for AutoRest
  • Completely new asynchronous Java Client Library using the Reactive programming model

Developer survey & feedback

Please let us know how we’re doing by taking the 5-question survey. We are actively working on the features you’ve requested, and the annual survey is one of the easiest ways for you to influence our roadmap!

Sercan Guler and Jeffrey Richter
