
Network Performance Monitor’s Service Connectivity Monitor is now generally available


Network Performance Monitor’s (NPM) Service Connectivity Monitor, previously in preview as Service Endpoint Monitor, is now generally available under a new name. With Service Connectivity Monitor, you can monitor connectivity to services such as applications, URIs, VMs, and network devices, as well as determine what infrastructure is in the path and where network bottlenecks are occurring.

As services and users are becoming more dispersed across clouds, branch offices, and remote geographies, it is becoming more difficult to determine the cause of a service outage or performance degradation. These can be due to an issue with the application, stack, or cluster as well as network issues in the cloud, the carrier network, or in the first-mile. Service Connectivity Monitor integrates the monitoring and visualization of the performance of your internally hosted and cloud services with the end-to-end network performance. You can create connectivity tests from key points in your network to your services and identify whether the problem is due to the network or the application. With the network topology map, you can locate the links and interfaces experiencing high loss and latencies, helping you identify external and internal troublesome network segments.

Determine if it’s an application or a network problem

You can determine whether an application connectivity issue is due to the application or the network by correlating the application response time with the network latency.

The example image below illustrates a scenario where spikes in the application response time are accompanied by corresponding spikes in the network latency. This suggests that the application degradation is caused by an increase in network latency, and therefore the issue lies in the underlying network, not the application.

Application or network problem

The example image below demonstrates another scenario, where there is a spike in the application response time while the network latency remains consistent. This suggests the network was in a steady state when the performance degradation was observed; the problem is therefore due to an issue at the application end.

Scenario 1

Identify network bottlenecks

You can view all the paths and interfaces between your corporate premises and application endpoint on NPM’s interactive topology map. You not only get end-to-end network visibility from your nodes to the application, but you can also view the latency contributed by each interface to help you identify the troublesome network segment. The image below illustrates a scenario where you can identify the highlighted network interface as the one causing most latency.

Network bottlenecks

Monitor service connectivity from multiple vantage points in a central view

You can monitor connectivity to your services from your branch offices, datacenters, office sites, cloud infrastructure, and more, all from a central view. By installing NPM agents at vantage points in your corporate perimeter, you get performance visibility from the locations where your users access the application.

The example image below illustrates a scenario where you can view the network topology from multiple source nodes to www.msn.com in a single view and identify the nodes with connectivity issues from the unhealthy paths shown in red.

Service endpoints

Monitor end-to-end connectivity to services

Monitor the total response time, network latency, and packet loss between the source nodes in your corporate perimeter and the services you use, such as websites, SaaS, PaaS, Azure services, file servers, and SQL servers, among others. You can set up alerts to be notified proactively whenever the response time, loss, or latency from any of your branch offices crosses a threshold. In addition to viewing near real-time values and historical trends of the performance data, you can use the network state recorder to go back in time and view a particular network state, helping you investigate difficult-to-catch transient issues.

Tech nodes for Bing

Monitor connectivity to Microsoft services using built-in tests for Microsoft Office 365 and Dynamics 365

Service Connectivity Monitor provides built-in tests that allow a simple one-click setup experience to monitor connectivity to Microsoft’s Office 365 and Dynamics 365 services, without any pre-configuration. Since the capability maintains a list of endpoints associated with these services, you do not have to enter the various endpoints associated with each service. 

Monitoring

Create custom queries and views

All data that is exposed graphically through NPM’s UI is also available natively in Log Analytics search. You can perform interactive analysis of data in the repository, correlate data from different sources, create custom alerts and views, and export the data to Excel, Power BI, or a shareable link.

Get started

You can find detailed instructions about how to set up Service Connectivity Monitor in NPM and learn more about the other capabilities in NPM.

Provide feedback

There are a few different routes to give feedback:

  • UserVoice: Post new ideas for Network Performance Monitor on our UserVoice page.
  • Join our cohort: We’re always interested in having new customers join our cohorts to get early access to new features and help us improve NPM going forward. If you are interested in joining our cohorts, simply fill out this quick survey.

Azure.Source – Volume 37


Big Data Analytics

Microsoft deepens its commitment to Apache Hadoop and open source analytics - Last week, Hortonworks hosted the DataWorks Summit in San Jose, California, which is billed as the world’s premier big data community event for everything data. This post provides a rollup of Azure HDInsight info and reflects on the progress made since the start of Microsoft's partnership with Hortonworks and contributions made to Apache projects, such as Hadoop, Kafka, and Spark. More details are also available in the four posts listed below.

Top 8 reasons to choose Azure HDInsight - Learn the top eight reasons why enterprises such as Adobe, Jet, ASOS, Schneider Electric, and Milliman are choosing Azure HDInsight for their big data applications.

Azure Data ingestion made easier with Azure Data Factory’s Copy Data Tool - The Copy Data Tool sets up a pipeline to accomplish the data loading task in minutes, without having to understand or explicitly set up Linked Services and datasets for source and destination. Now, it supports all 70+ on-prem and cloud data sources, and we will continue to add more connectors in the coming months.

Event trigger based data integration with Azure Data Factory - This post announces support for event based triggers in your Azure Data Factory (ADF) pipelines, such as file landing or getting deleted in your Azure storage.

Siphon: Streaming data ingestion with Apache Kafka - Learn how Siphon helps drive innovation in Azure HDInsight. Siphon is an internal Microsoft solution that provides a highly available and reliable distributed Data Bus for ingesting, distributing and consuming near real-time data streams for processing and analytics for cloud services of massive scale, such as Bing, Office 365, and Skype.

A simplified view of the Siphon architecture

Now in preview

Azure Service Bus is now offering support for Availability Zones in preview - Azure Availability Zones support for Service Bus Premium provides an industry-leading, financially-backed SLA with fault-isolated locations within an Azure region, providing redundant power, cooling, and networking. This preview begins with Central US, East US 2, and France Central.

Azure Event Hubs is now offering support for Availability Zones in preview - Availability Zones are now in public preview for new Standard Event Hubs namespaces. This preview begins with Central US, East US 2, and France Central.

Immutable storage for Azure Storage Blobs now in public preview - The public preview of immutable storage for Azure Storage blobs addresses requirements for regulated industries to retain business-related communication in an immutable state that makes it non-erasable and non-modifiable for a certain retention interval. The feature is now available in all Azure public regions. Through configurable policies, users can keep Azure Blob storage data in an immutable state where blobs can be created and read, but not modified or deleted.

Resumable Online Index Create is in public preview for Azure SQL DB - Resumable Online Index Create (ROIC) is now available for public preview in Azure SQL DB, which enables the pausing of an index create operation and the ability to resume it later from where the index create operation was paused or failed, rather than having to restart the operation from the beginning.

Azure AD Password Protection and Smart Lockout are now in Public Preview - Azure AD Password Protection helps you eliminate easily guessed passwords from your environment, which can dramatically lower the risk of being compromised by a password spray attack. Smart lockout is our lockout system that uses cloud intelligence to lock out bad actors who are trying to guess your users’ passwords.

Now generally available

Announcing the general availability of Azure SQL Data Sync - Azure SQL Data Sync is a service built on Azure SQL Database that enables you to synchronize the data you select bi-directionally across multiple SQL databases and SQL Server instances. In addition, Azure SQL Data Sync now supports a better configuration experience, more reliable and faster database schema refresh, and more secure data synchronization.

Azure SQL Data Sync

Disaster Recovery solution for Azure IaaS applications - This post builds on the recent announcement for the general availability of Disaster Recovery (DR) for Azure Virtual Machines (VMs) using Azure Site Recovery (ASR) by highlighting the benefits of ASR. Disaster Recovery between Azure regions is available in all Azure regions where ASR is available.

Traffic Analytics now generally available - Azure Traffic Manager provides metrics for monitoring the health status of endpoints in a profile and the amount of traffic that a profile receives: Queries by Endpoint Returned and Endpoint Status by Endpoint. You can also configure and receive alerts based on conditions that interest you. These metrics and alerts are provided via the Azure Monitor service.

The Azure Podcast

The Azure Podcast: Episode 234 - Migrations - Longtime guest on the show, Bill Zack joins us along with colleague Aaron Ebertowski to talk about Azure Migrations.

News and updates

Maven: Deploy Java Apps to Azure with Tomcat on Linux - The Maven Plugin for Azure App Service provides seamless integration of Azure services into Maven projects. In just one step, you can deploy your WAR file to Azure Web Apps on Linux with the built-in running stack of Java 8 and Tomcat 8.5 or 9.0.
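As a rough illustration of the pom.xml wiring such a deployment uses: the plugin coordinates below are the published ones, but the version and every value inside the configuration block are placeholder assumptions to adapt to your own app (check the plugin documentation for current settings).

```xml
<!-- pom.xml excerpt (illustrative; version and configuration values are assumptions) -->
<plugin>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-webapp-maven-plugin</artifactId>
  <version>1.1.0</version>
  <configuration>
    <resourceGroup>my-resource-group</resourceGroup>
    <appName>my-webapp-on-linux</appName>
    <!-- built-in Linux stack: Java 8 with Tomcat, as described above -->
    <linuxRuntime>tomcat 8.5-jre8</linuxRuntime>
  </configuration>
</plugin>
```

With configuration along these lines in place, a deployment is then a single step, e.g. `mvn package azure-webapp:deploy`.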

Azure Blockchain Workbench 1.1.0 extends capabilities and monitoring - This post announces the first major update to Azure Blockchain Workbench, which was released to preview in May. This update includes multi-workflow and contract support, monitoring and usability improvements, bug fixes, and more.

Introducing the redesigned Security Center Overview dashboard - The redesigned Security Center dashboard provides cross-subscription, organizational-level reports of the most important metrics that influence the organizational security posture, as well as providing actionable insights to help organizations improve their overall security posture. The dashboard also now has metrics for Subscription Coverage and Policy Compliance.

Column-Level Security is now supported in Azure SQL Data Warehouse - Column-Level Security (CLS) for Azure SQL Data Warehouse is an additional capability for managing security for sensitive data that places the access restriction logic in the database itself. This capability is available now in all Azure regions with no additional charge.

Location and Maps in Azure IoT Central powered by Azure Maps - Azure IoT Central now leverages Azure Maps, which provides a portfolio of geospatial functionality natively integrated into Azure, giving users the fresh mapping data needed to provide geographic context to their location-aware IoT applications. This enables Azure IoT Central customers to add maps to the Device dashboard and layer location information using the Azure Maps JavaScript Control Services.

Cost Reporting ARM APIs across subscriptions for EA customers - This month we are releasing ARM-supported APIs for the enrollment hierarchy. This will enable users with the required privileges to make API calls to the individual nodes in the management hierarchy and get the most current cost and usage information.

Additional news and updates

Tuesdays with Corey

More info on Azure Container Instances - Corey Sanders, Corporate VP - Microsoft Azure Compute team sat down with Madhan Ramakrishnan, Partner PM Manager on the Azure Compute Team to talk about ACI (now GA - Azure Container Instances).

Technical content and training

Enabling Smart Manufacturing with Edge Computing - This article introduces Edge Computing and discusses its role in enabling Smart Manufacturing. The service Azure offers to enable Edge Computing is called Azure IoT Edge, which is a fully managed service that delivers cloud intelligence locally by deploying and running artificial intelligence (AI), Azure services, and custom logic directly on cross-platform IoT devices.

Azure tips & tricks

Working with Log Stream and Azure App Service

Setting up Email Alerts with Azure App Service

Events

Dive into blockchain for healthcare with the HIMSS blockchain webinar - At the recent HIMSS18 conference in Las Vegas, David Houlding moderated a panel discussion of worldwide experts on blockchain in healthcare, which was organized by the HIMSS Blockchain Work Group. This post links to a video of that discussion as well as a link to a follow-on on-demand webinar where the same panel of experts shared their direct experience in piloting or preparing to pilot blockchain in healthcare this year.

The IoT Show

Kubernetes integration with Azure IoT Edge - You are using Kubernetes to deploy and manage your containerized applications and wonder about distributing your clusters across traditional nodes and IoT devices? Azure IoT Edge now integrates with Kubernetes allowing you to seamlessly do exactly that! Learn how this works and check out the demo by Venkat Yalla, PM in the Azure IoT team.

Customers and partners

Azure Marketplace new offers: May 16-31 - The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. In the second half of May we published 31 new offers, including: Looker Analytics Platform, Solar inCode, and Kong Certified by Bitnami.

Azure Friday

Azure Friday | Episode 441 - Service Fabric Extension for VS Code - Peter Pogorski chats with Scott Hanselman about building Service Fabric applications with the Service Fabric for VS Code extension. This episode introduces the process of creating and debugging Service Fabric applications with the new Service Fabric extension for VS Code. The extension enables you to create, build, and deploy Service Fabric applications (e.g., C#, Java, Containers, and Guests) to local or remote clusters.

Azure Friday | Episode 442 - Cross Platform for Azure PowerShell - Aaron and Scott check out the latest updates in Azure PowerShell, from simplified scenarios to `-AsJob` backgrounding support. This new functionality is now available on Mac, Linux, and Windows with PowerShell Core 6.

Azure Friday | Episode 443 - Azure CLI Extensions - Aaron and Scott check out the latest extensions for Azure CLI. Azure CLI Extensions provide new, exciting features, and the Alias Extension is the first of many user-centric extensions that make Azure automation simple and easy.

Developer spotlight

Q&A with Kubernetes co-founder Brendan Burns - Stella Lin sat down with Brendan Burns, co-founder of Kubernetes, to get a behind-the-scenes look at the service, as well as what’s new and what’s next with the Kubernetes community.

SmartHotel360 Microservices on Azure Kubernetes Service - To help you learn how to deploy microservices written in any framework to AKS, we updated the SmartHotel360 back-end microservices source code and deployment process to optimize it for AKS. Clone, fork, or download the AKS and Azure Dev Spaces demo on GitHub to learn more.

Tutorial: Deploy, manage, and update applications on an AKS cluster - In this seven-part tutorial, you’ll create container images, upload them to Azure Container Registry, deploy them to an AKS cluster, run your container images, scale the application on AKS, update it while running, and upgrade the AKS cluster.

Monitor Azure Kubernetes Service (AKS) container health (Preview) - Learn how to set up and use Azure Monitor container health to monitor the performance of your workloads deployed to Kubernetes environments hosted on Azure Kubernetes Service.

Tutorial: Use a custom Docker image for Web App for Containers - In this tutorial, you learn how to build a custom Docker image and deploy it to Web App for Containers. This pattern is useful when the built-in images don't include your language of choice, or when your application requires a specific configuration that isn't provided within the built-in images.

Azure This Week from A Cloud Guru

Azure This Week - 22 June 2018 - In this episode of Azure This Week, James takes a look at the public preview announcements for Azure Blob Storage lifecycle management and immutable storage for Azure Storage Blobs, as well as the public preview of Azure AD Password Protection.

New zone-redundant VPN and ExpressRoute gateways now in public preview


As with all of Azure, we are continuously innovating, upgrading, and refining our virtual network gateways to further increase reliability and availability.

Today, we are sharing the public preview of zone-redundant VPN Gateway and ExpressRoute virtual network gateways. By adding support for Azure Availability Zones, we bring increased resiliency, scalability, and higher availability to virtual network gateways.

You can now deploy VPN and ExpressRoute gateways in Azure Availability Zones. This physically and logically separates the gateway instances into different Availability Zones, protecting your on-premises network connectivity to Azure from zone-level failures. Additionally, we have made fundamental performance improvements, including reducing the deployment time to create a virtual network gateway.

To automatically deploy your virtual network gateways across availability zones, you can use zone-redundant virtual network gateways.

Your virtual network

Zone-redundant virtual network gateways use specific new gateway SKUs for VPN Gateway and ExpressRoute. To begin using zone-redundant gateways, self-enroll your subscription in the public preview. Once you enroll, you will see the new gateway SKUs in all the Azure Availability Zone regions. See the getting started guide for steps to self-enroll, information about the new gateway SKUs, and configuration details.

The new gateway SKUs also support other deployment options to best match your needs. When creating a virtual network gateway using the new gateway SKUs, you also have the option to deploy the gateway in a specific zone. This is referred to as a zonal gateway. When you deploy a zonal gateway, all the instances of the gateway are deployed in the same Availability Zone.

We look forward to you trying out the new gateways and to receiving your feedback!

Chakra documentation is joining MDN web docs


Last fall, along with other browser vendors and web platform stakeholders, the Edge platform (EdgeHTML) team adopted MDN web docs as our official web API reference library. In the process, we redirected over 7700 MSDN pages for HTML, CSS, SVG, and DOM APIs to corresponding reference content on MDN and joined the community in helping maintain it, including making over 5000 edits to the browser compatibility tables to reflect the latest Edge API surface.

Today the Microsoft Edge JavaScript engine, Chakra, joins the community-wide effort to make MDN web docs the web’s one-stop, premiere development reference, starting by redirecting all 500+ pages of Microsoft’s JavaScript API reference from docs.microsoft.com to their MDN counterparts and porting our legacy JS extensions reference for historical and servicing purposes.

Illustration showing the Edge logo alongside the MDN web docs logo

We want to thank our community for enthusiastically embracing MDN web docs, and special kudos to everyone who helps to improve the docs, edit by edit. In the previous year alone, you helped the MDN web docs community grow monthly users by over 50%! Let’s keep documenting the web forward together.

– Erika Doyle Navara, MDN Product Advisory Board representative for Microsoft Edge

The post Chakra documentation is joining MDN web docs appeared first on Microsoft Edge Dev Blog.

Visual Studio 2017 version 15.8 Preview 3


We’re happy to share the highlights of the latest Visual Studio 2017 preview, which is now available for download.

This Preview builds upon the features that debuted in Visual Studio version 15.8 Preview 1 and Preview 2 which were both released last month. As always, you can drill into the details of all of these features by exploring the Visual Studio 2017 version 15.8 Preview release notes.

We hope that you will install and use this Preview, and most importantly, share your feedback with us. To acquire the Preview, you can either install it fresh from here, update your bits directly from the Preview IDE, or if you have an Azure subscription, you can simply provision a virtual machine with this latest Preview. We appreciate your early adoption, engagement, and feedback as it helps us ship the most high-quality tools to everyone in the Visual Studio community.

Please note: A number of great new 15.8 features are debuting in Preview 3, so you’ll find this blog is more detailed than most of our preview posts.

Productivity

C# Code Cleanup: The next time you invoke Format Document (Ctrl+K, D), you will see a yellow bar prompting you to configure your code cleanup. These new settings allow you to go beyond simply reformatting your file by enabling you to remove and sort usings, as well as fix coding-convention violations to match the preferences in your .editorconfig or, lacking that, your Tools > Options configuration. You can change your Format Document settings at any time by going to Tools > Options > Text Editor > C# > Code Style > Formatting > General, and you can find the new .editorconfig options for .NET code style in our documentation.

You can change your Format Document settings at any time by going to Tools > Options > Text Editor > C# > Code Style > Formatting > General
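For instance, a minimal .editorconfig at the solution root might encode the conventions Code Cleanup enforces; the specific rules chosen below are a real subset of the .NET code-style options, picked purely for illustration:

```ini
# .editorconfig (illustrative sample; pick the rules your team actually wants)
[*.cs]
# sort usings with System.* directives first when "remove and sort usings" runs
dotnet_sort_system_directives_first = true
# suggest 'var' for built-in types
csharp_style_var_for_built_in_types = true:suggestion
# suggest expression-bodied methods where they fit on one line
csharp_style_expression_bodied_methods = true:suggestion
```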

Deferred Extension Loading: Visual Studio has a rich ecosystem of extensions that extend functionality and enhance the development experience. Getting all this functionality loaded, initialized, and running can take time. In order to preserve Visual Studio’s startup responsiveness, cooperating extensions won’t be loaded until after a solution has been opened. The progress bar in the task status center will give you visibility into the state of the loaded extensions. Authors of startup performance friendly extensions (i.e. those who have taken advantage of AsyncPackage so that their extensions can load on the background) should test their extension against Visual Studio 2017 version 15.8 and confirm that everything continues to work after deferred loading. See more details in this blog.

Deferred Extension Loading

Multi-Caret Support: In version 15.8 Preview 3, we’re excited to address one of our top voted UserVoice items: native multi-caret support. With this feature, you’ll be able to create multiple insertion points or selections at arbitrary places in your file or add additional selections that match your current selection, allowing you to add, delete or select text in multiple places at once. For a full list of features included with multi-caret support, look under the Edit > Multiple Carets menu flyout.

Additional Keyboard Mappings: We know many people switch between Visual Studio Code and Visual Studio. To help give you a more consistent set of key mappings as you switch between the two, we’ve introduced a new Visual Studio Code keyboard mapping scheme. We’ve also included a key mapping scheme that matches the ReSharper mappings.  You can find and activate these profiles under Tools > Options > Environment > Keyboard > Mapping Schemes dropdown.

Keyboard mappings

Performance

Project Loading: Visual Studio version 15.8 Preview 3 brings faster project reloads for all types of projects. The benefits are huge: project unload and reload that used to take more than a minute for large solutions now takes just a few seconds, as does a Git branch checkout that changes a project file. Stay tuned for more project load performance improvements in subsequent releases. Please check out the Visual Studio Preview release notes for details on how we made these perf gains.

Showing Data Connections from ASP.NET Web Projects in Server Explorer: While loading ASP.NET Web projects, database connections specified in web.config are automatically added to the “Data Connections” node in the Server Explorer window. Adding these connections, particularly if there are a lot of them, may take a long time to complete. Although we have improved the performance of loading these connections, we have also received feedback from some users that they would like to load them manually at a later point. In this update, we introduced a new setting that allows you to configure and control how and when data connections are loaded. This setting, which is checked by default, can be found under Tools > Options > Projects and Solutions > Web Projects: “Automatically show data connections from web.config in Server Explorer”. If it is unchecked, connections will no longer be loaded automatically. You can add them manually by using the “Add Connections from web.config” context menu on the “Data Connections” node in Server Explorer.

Performance – Profiling Applications

We have a few notable improvements to highlight in this Preview regarding the CPU Usage Tool. These improvements build upon the CPU Usage Tool features we announced with version 15.8 Preview 2.

Pause/Resume for Collection of CPU Usage data: The CPU Usage tool in the Performance Profiler (Alt+F2) can now start in a paused state, which means it will not collect any CPU usage sample stack data until collection is specifically enabled. This greatly reduces the amount of data to collect and analyze, making your performance investigations more efficient. Once you start the target application, a monitoring display shows the CPU utilization graph and lets you control CPU profiling, enabling and disabling sample data collection as many times as you like.

Pause / Resume Collection of CPU Usage Data

The CPU utilization graph changes color to indicate whether sample collection is enabled/disabled at that point in time.

The .NET Object Allocation Tracking Tool joins the family of tools available from the Performance Profiler (Alt+F2). Invoking this tool for a profiling session collects a stack trace for every .NET object allocation that occurs in the target application. This stack data is analyzed along with object type and size information to reveal details of your application’s memory activity. You can quickly determine the allocation patterns in your code and identify anomalies. For garbage collection events, you can easily determine which objects were collected and which were retained, revealing the object types that dominate your application’s memory usage. This is especially useful for API writers looking to minimize allocations. While your test application is executing, the Performance Profiler displays a monitoring view with a line graph of Live Objects (count), as well as an Object Delta (% change) bar graph.

The .NET Object Allocation Tracking Tool

Refer to the Visual Studio Preview release notes for details on how to configure settings for these tools and learn more about how to use them. Please give the CPU Usage Tool and the .NET Object Allocation Tracking a try and send feedback if you have any issues or suggestions.

JavaScript and TypeScript Tooling

There are a lot of improvements to JavaScript and TypeScript tooling in this version 15.8 Preview. Highlights include:

TypeScript 2.9: This release includes TypeScript 2.9 by default, which includes richer IntelliSense for some common JavaScript patterns, several new refactorings, and numerous type system improvements. For full details, see the recent TypeScript 2.9 blog post.

Improved Vue.js support: Support for the Vue.js library has been improved, particularly regarding support for .vue files, aka “single file components”, and enhancements when editing script blocks inside .vue files. Additionally, when the Node.js workload is installed, the New Project dialog will contain additional “Basic Vue.js Web Application” templates under the “JavaScript / Node.js” or “TypeScript / Node.js” paths.

Improved Vue.js support

ESLint support has been reimplemented in this release, and the following functionality is now enabled. Rather than only linting saved files, Visual Studio will now lint JavaScript files while they’re opened and being edited. Additionally, results will be reported for all JS files in your project, not just open files. If there are parts of your project you do not want to be linted, an .eslintignore file can now be used to specify directories and files that should be ignored. ESLint has been updated to use ESLint 4 by default, but if your project has a local installation of ESLint, it will use that version instead.

Visual Studio will now lint JavaScript files while they are open and being edited
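The .eslintignore file mentioned above uses gitignore-style patterns; a small illustrative example (the directories listed are typical choices, not requirements):

```
# .eslintignore at the project root
# skip dependencies and build output
node_modules/
dist/
# skip pre-minified bundles anywhere in the tree
**/*.min.js
```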

Open Folder: Visual Studio 2017 version 15.8 Preview 3 includes productivity improvements to the Node.js and TypeScript development experience. For example, when working in Visual Studio via the “File / Open / Folder…” menu option, you can now right-click to bring up the context menu and select “Build” to build the TypeScript code with the latest TypeScript compiler, “Debug” to run the file with the debugger attached, or “Npm” to invoke NPM package management commands. Refer to the Visual Studio Preview release notes for details on how to configure your projects so these commands are available.

Npm right click context menu with Build Debug and Npm options

Improved Editor Performance: In previous releases, all JavaScript and TypeScript language service operations were serviced by a single Node.js process. This could cause editor delays if commands that impact user typing (such as automatic formatting after a newline) were sent while a potentially lengthy operation was already in process (such as analyzing code for errors). To mitigate this, a separate process is now used for the operations that most impact editing. This process is significantly lighter on system resources than the existing language service process.

C++ Development

C++ Templates IntelliSense: Visual Studio 2017 version 15.8 Preview 3 brings IntelliSense for Templates – you can provide more details about template arguments to take full advantage of IntelliSense within your template body.

C++ Refactoring: We’ve also added a new quick-fix lightbulb to convert basic macros to constexpr as a new tool to modernize your C++ code.

C++ Just My Code debugging now enables you to step over code from system or third-party C++ libraries, in addition to collapsing those calls in the Call Stack window. You can control this behavior for any C++ library when your code is compiled with /JMC (the default for Debug configurations) and the non-user library paths are specified in a .natjmc file. If a system library calls into user code, the debugger skips all system code when you step in and stops on the first line of the user-code callback.
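As a rough sketch, a .natjmc file is an XML document listing the modules, files, or functions to treat as non-user code; the entries below are hypothetical placeholders, so check the official documentation for the exact schema supported by your Visual Studio version:

```xml
<?xml version="1.0" encoding="utf-8"?>
<NonUserCode>
  <!-- Hypothetical entries: treat these paths and names as placeholders -->
  <File Name="*\ThirdParty\*" />
  <Module Name="SomeVendorLib.dll" />
</NonUserCode>
```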

Code Analysis: We are continuously working to refresh our code analysis experience. You can now enable the new, in-progress features under Tools > Options > Text Editor > C++ > Experimental > Code Analysis. Code analysis can run in the background when files are opened or saved, and results will be displayed in the error list and as green squiggles in the editor.

Code analysis results displayed in the error list and as green squiggles in the editor

CMake: Adding configurations to CMakeSettings.json is now as simple as selecting a template.

C++ Standards: This release adds a new, experimental, token-based preprocessor that conforms to C++11 standards (including C99 preprocessor features), enabled with the /experimental:preprocessor switch. It is controlled with the macro _MSVC_TRADITIONAL, which is defined to 1 when using the traditional preprocessor and 0 when using the new, standards-conformant preprocessor.
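For example, code that needs to behave differently under the two preprocessors can test this macro; this sketch also guards against compilers that do not define it at all:

```cpp
// Select a message based on which preprocessor compiled this translation unit.
#if defined(_MSVC_TRADITIONAL) && _MSVC_TRADITIONAL
const char* preprocessor_mode = "traditional MSVC preprocessor";
#elif defined(_MSVC_TRADITIONAL)
const char* preprocessor_mode = "conformant (token-based) preprocessor";
#else
const char* preprocessor_mode = "not MSVC";  // e.g. GCC or Clang
#endif
```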

Spectre mitigations: The Visual Studio Developer Command Prompt now supports enabling the Visual C++ Spectre variant 1 mitigated runtimes (via the -vcvars_spectre_libs=spectre switch). More information about Spectre mitigations is available on the Visual C++ Team Blog.

Visual Basic Development

Integer manipulation: Visual Basic normally uses functions in the System.Math library when converting non-integer types to integers. This is generally beneficial, as the conversion provides a rounded result. However, there is a performance penalty when rounding is not needed. Since Visual Basic programmers indicate truncation with Fix(), a new optimization will be applied to the pattern CInt(Fix(number)), which improves performance in scenarios such as graphics manipulation.

Azure Development

Azure Functions: You can now configure continuous delivery for solutions with Azure Functions projects directly from Visual Studio 2017. Right-click on the solution and click “Configure Continuous Delivery to Azure…” to launch the dialog. Select “Azure Function” as the target host type and click OK. Visual Studio will automatically create a new Azure Function, a build definition, and a release definition targeting that host to automatically deploy your Azure Function every time you update your code.

Azure Key Vault: We have continued to build on the secret management experience by adding the ability to configure an Azure Key Vault with your published application. The Azure Key Vault provides a secure location to safeguard keys and other secrets used by applications so that they do not get shared unintentionally.  You can attach a Key Vault already connected with your local project to the published application through the publish summary page. Otherwise, setting a Key Vault up through the publish summary page will attach a Key Vault to both the published and local application.

Publishing Apps to Azure: When creating a new App Service, Visual Studio now offers the ability to configure Application Insights for your site as part of the initial creation process. Application Insights provides diagnostic, analytics, and performance data for your site. If Application Insights is available in the region of your site, it will automatically be enabled, as indicated by the Application Insights dropdown being set to the same location as your App Service. If Application Insights is not currently available in the same region as your site, it will default to “None”, but you can still enable it by selecting a supported region in which to run Application Insights for your site.

Web Development

A couple of months ago, we introduced a new Library Manager (LibMan), a lightweight, effective solution for web developers to easily manage common client-side library files. Features of the Library Manager include support for common operations like restore and clean, as well as productivity aids like IntelliSense. In this Preview, the Library Manager adds UI tooling to find and select library files in order to add them to your project. Additionally, LibMan now recognizes another provider, unpkg, which provides access to all files available on the npm repository.

Library Manager added UI tooling to find and select library files in order to add them to your project

Mobile Development

Android Incremental Build Improvements: Xamarin.Android leverages files generated in the intermediate output directory to achieve incremental builds that are faster than full builds. Previously, if you changed your project’s target framework it would invalidate the files and result in a full build on the next run. In this release we now preserve the files in per-framework folders so you can switch between different target frameworks and still benefit from incremental builds. Cleaning the project will allow you to reclaim the disk space used by the preserved files.

Xamarin.Essentials APIs: Our Xamarin templates in Visual Studio 2017 have been updated to include Xamarin.Essentials, a core set of cross-platform APIs to help developers build native apps. Xamarin.Essentials gives developers access to over 30 platform-specific APIs that can be accessed from their shared code, including geolocation, secure storage, sensors, device information, and many more. Best of all, it can be used in any iOS, Android, UWP, or Xamarin.Forms app, regardless of how you create the user interface.

Try out the Preview today!

If you’re not familiar with Visual Studio Previews, take a moment to read the Visual Studio 2017 Release Rhythm. Remember that Visual Studio 2017 Previews can be installed side-by-side with other versions of Visual Studio and other installs of Visual Studio 2017 without adversely affecting either your machine or your productivity. Previews provide an opportunity for you to receive fixes faster and try out upcoming functionality before it becomes mainstream. Similarly, the Previews enable the Visual Studio engineering team to validate usage, incorporate suggestions, and detect flaws earlier in the development process. We are highly responsive to feedback coming in through the Previews and look forward to hearing from you.

Please get the Visual Studio Preview today, exercise your favorite workloads, and tell us what you think. If you have an Azure subscription, you can provision a virtual machine of this preview. You can report issues to us via the Report a Problem tool in Visual Studio or you can share a suggestion on UserVoice. You’ll be able to track your issues in the Visual Studio Developer Community where you can ask questions and find answers. You can also engage with us and other Visual Studio developers through our Visual Studio conversation in the Gitter community (requires GitHub account). Thank you for using the Visual Studio Previews.

Christine Ruana, Principal Program Manager, Visual Studio

Christine is on the Visual Studio release engineering team and is responsible for making Visual Studio releases available to our customers around the world.

Announcing Template IntelliSense


C++ developers using function templates and class templates can now take full advantage of IntelliSense within their template bodies. In Visual Studio 2017 15.8 Preview 3, when your caret is inside a template, a new UI element called a “Template Bar” appears next to the template definition. The Template Bar allows you to provide sample template arguments for IntelliSense. 

For example, let’s look in the Boost library at the function template is_partitioned_until inside of algorithm.hpp (which I slightly modified for this demo). We can use the Template Bar to give IntelliSense an example of the InputIterator type and the UnaryPredicate type. 

  • Click the <T> icon to expand/collapse the Template Bar. 
  • Click the pencil icon or double-click the Template Bar to open the Edit window. 

Notice that we were able to use decltype on the UnaryPredicate called myPredicate. With this information provided, we have the full power of IntelliSense while we edit the template body. We get all the proper squiggles, quick info, parameter help, etc. 
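To make this concrete, here is a simplified sketch of a template along the lines of is_partitioned_until (not the actual Boost implementation). Supplying sample arguments via the Template Bar — for example InputIterator = int* and a lambda for UnaryPredicate — lets IntelliSense resolve expressions like p(*first) inside the body:

```cpp
template <typename InputIterator, typename UnaryPredicate>
InputIterator is_partitioned_until(InputIterator first, InputIterator last,
                                   UnaryPredicate p)
{
    // Skip the leading run of elements that satisfy the predicate...
    for (; first != last; ++first)
        if (!p(*first))
            break;
    // ...then verify the remainder does not satisfy it.
    for (; first != last; ++first)
        if (p(*first))
            return first;  // partition property is broken here
    return last;
}
```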

We are considering the Template Bar information to be user-specific, thus it is stored in the .vs folder and not shared on commits. 

What’s Next? 

Download the latest Visual Studio 2017 Preview and try it with your projects. To disable/enable the feature, go to Tools > Options > C/C++ > Advanced > IntelliSense > Enable Template IntelliSense. 

We will continue to improve this feature in subsequent releases. We already have plans to support nested templates and to handle edits outside of Visual Studio. 

As with all new features, your feedback is very important in helping guide our development. You can send me your feedback on Twitter @nickuhlenhuth, or reach out to the Visual Studio C++ team @visualc or visualcpp@microsoft.com. 

Convert Macros to Constexpr


Visual Studio 2017 version 15.8 is currently available in preview. Today, Preview 3 has been released, and it comes with several features that improve developer productivity. One key theme in 15.8 is code modernization, and macros are a key target for that. In 15.8 Preview 1, we announced the ability to expand macros in Quick Info tooltips, and now, for Preview 3, we are happy to announce a way to convert them to modern C++ constexpr expressions. The new preview includes a quick fix, accessible from the editor window, that identifies macros that can be converted to constexpr and offers to perform the conversion as a way to clean up and modernize your code. This feature (like editor features in general) is configurable and can be turned on/off as needed.

The macro -> constexpr Quick Fix

Right away, when viewing your code in the editor, you may notice some “…” under certain macros on #define directives. These “…” are called Suggestions, and they are a separate category from errors (red squiggles, for the most severe issues) and warnings (green squiggles, for moderately severe issues). A Suggestion covers low-severity code issues.
Macro suggestion in editor
Opening the Quick Actions & Refactorings menu (with Alt + Enter or via the right-click menu) will show a new “Convert macro to constexpr” option.
Convert macro to constexpr quick-fix
When the option is selected, a preview window appears, summarizing the intended change:
Macro->constexpr preview window
Once the change is applied, the expression is converted to constexpr in the code editor:
Constexpr conversion
The feature works for constants, and it also works for basic expressions using function-like macros as well:
Function-like macro expressions
You may notice that the macro MAX above does not have the “…” under it. For function-like macros, to maintain stable IDE performance, we do not run a full preprocess to guarantee that the attempted conversion will be successful. Since we only want to show the Suggestion when we can guarantee that the conversion makes sense, we elect not to show the “…” indicator. However, you can still find the option to convert in the lightbulb menu, and we then fully process the macro when you click Apply in the preview window. In this case, this macro is converted to the following template:
Macro converted to template
Basically, you can always try to convert a macro to constexpr yourself, just don’t expect it to always work if you do not see a “…”. Not all macros are actually constexpr-able, since there are a wide range of macros that exhibit all sorts of behaviors that are unrelated to constants and expressions.
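As an illustration of the two kinds of conversion (approximating the generated code, which may differ in detail):

```cpp
// Before: traditional macros
// #define BUFFER_SIZE 128
// #define MAX(a, b) (((a) > (b)) ? (a) : (b))

// After conversion: a constexpr constant...
constexpr auto BUFFER_SIZE = 128;

// ...and a constexpr function template for the function-like macro.
template <typename T1, typename T2>
constexpr auto MAX(T1 a, T2 b)
{
    return a > b ? a : b;
}
```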

Tools > Options Configuration

You can configure the Macro->constexpr feature in Tools > Options > Text Editor > C/C++ > View > Macros Convertible to constexpr. There, you can choose whether to display instances as Suggestions (the default behavior), Warnings (green squiggles), Errors (build-breaking red squiggles), or None (to hide the editor indicator altogether), depending on your preference.
Tools > Options configuration

Give us your feedback!

This is our first release for this feature. We appreciate any feedback you may have on how we can make it better in the comments below. If you run into any bugs, please let us know via Help > Send Feedback > Report A Problem in the IDE.

New, experimental code analysis features in Visual Studio 2017 15.8 Preview 3


The Visual C++ team has been working to refresh our code analysis experience inside Visual Studio. We’re aiming to make these tools both more useful and natural to use and hope that they’ll benefit you no matter your workflow, style, or project type.

Trying out new features

In Visual Studio 2017 version 15.8 Preview 3, available in the Preview channel, we’ve introduced some new, in-progress code analysis features. These features are disabled by default, but you may enable them under Tools > Options > Text Editor > C++ > Experimental > Code Analysis. We encourage you to test them out and provide any feedback or comments you may have regarding your experience.

Background analysis

After enabling the features, code analysis will now run in the background when C++ files are opened or saved! Our goal here is to bring code analysis warnings into the editing experience so that bugs can be fixed earlier, and defects aren’t discovered only during build time. Once background code analysis runs for a file, warnings will be shown in the Error List and in the editor as squiggles.

In-editor warnings

Along with background analysis, code analysis warnings now display in-editor as green squiggles underneath the corresponding source code. In this Preview, if you change the file to fix a warning, the squiggles aren’t automatically refreshed. If the file is saved or analysis is re-run for the current file (Ctrl+Shift+Alt+F7), the squiggles and the Error List will be updated. We’re hoping these visual indicators will prove useful by giving you the ability to see code warnings in the same place you write and edit it.
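For example, a snippet like the following is typical of what background analysis flags (an uninitialized-variable warning in the C6001 family; the example and exact warning number are illustrative and depend on the active ruleset):

```cpp
// The analyzer flags 'result' because it is not initialized on every path.
int safe_divide(int a, int b)
{
    int result;        // suggestion: initialize, e.g. int result = 0;
    if (b != 0)
        result = a / b;
    return result;     // warning appears here as a green squiggle
}
```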

Error List

Code analysis warnings will continue to be displayed in the Error List, but we’re trying to improve this experience as well. Filtering in the Error List should be faster. We encourage using the “Current Document” filter to see only the errors for the files being edited; this pairs nicely with the background analysis feature. Warning details are also displayed inline in the Error List instead of in a separate pop-up window. We believe having the details near the error makes it easier to dig into warnings. The new Error List experience is still a work in progress, so let us know if there are any “must have” features we should consider.

Future work

We’re excited to show off a preview of what’s to come, but for now there are a few known issues you may encounter. First, only the “Recommended Native Rules” ruleset is used when background analysis is run. Second, not all project types are supported with background analysis. You can always try running code analysis through the menus to force squiggles to refresh. Finally, the best way to clear squiggles for a project is to “Clean” build or disable the experimental feature.

Along with improved background analysis runs, highlighting for multi-line warnings, and a change to squiggles to show when warnings are out-of-date, automatic fix-its are also on the way. These IntelliSense-like lightbulb menus will give you the ability to rapidly correct or make changes to your code – and see exactly what will be changed – directly in the editor.

Send us feedback

Thank you to everyone who helps make Visual Studio a better experience for all. Your feedback is critical in ensuring we can deliver the best code analysis experience, so please let us know how Visual Studio 2017 version 15.8 Preview 3 is working for you in the comments below. General issues can be reported from within Visual Studio via Report a Problem, and you can provide any suggestions through UserVoice. You can also find us on Twitter (@VisualC).


Azure Data Factory new capabilities are now generally available


Microsoft is excited to announce the General Availability of new Azure Data Factory (ADF V2) features that will make data integration in the cloud easier than ever before. With a new browser-based user interface, you can accelerate your time to production by building and scheduling your data pipelines using drag and drop. Manage and monitor the health of your data integration projects at scale, wherever your data lives, in cloud or on-premises, with enterprise-grade security. ADF comes with support for over 70 data source connectors and enables you to easily dispatch data transformation jobs at scale to transform raw data into processed data that is ready for consumption by business analysts using their favorite BI tools or custom applications.

For existing SQL Server Integration Services (SSIS) users, ADF now allows you to easily lift and shift your SSIS packages into the cloud and run SSIS as a service with minimal changes required to your existing packages. ADF will now manage your SSIS resources for you so you can increase productivity and lower total cost of ownership. Meet your security and compliance needs while taking advantage of extensive capabilities and paying only for what you use.

All of these new Azure Data Factory capabilities fall under the Azure general availability SLA. Here is just a small sampling of the exciting new features now available in ADF:

  1. Control flow: Azure Data Factory includes control flow data pipeline constructs such as branching, looping, conditional execution, and parameterization in order to allow you to orchestrate complex data integration jobs that are flexible and reusable.
  2. Code-free data movement and orchestration designing: You can now design, manage, maintain, and monitor your pipelines right in your browser. Native integration with Git repos in Visual Studio Team Services allows your development teams to collaborate on data pipelines as well as build and release management and automation.
  3. Iterative pipeline development: The ADF design environment lets you develop pipelines iteratively and debug them interactively, with debugging built in.
  4. Flexible scheduling: Schedule pipelines on a wall-clock scheduler, event-based triggers, or with tumbling window schedules.
  5. Lift and Shift SSIS into ADF: ADF provides Integration Runtime support that allows you to lift and shift your existing on-premises SSIS packages into the cloud by using Azure Data Factory where you can then execute, schedule, and monitor your SSIS package executions in the cloud.
  6. HDInsight Spark on-demand and Azure Databricks: Build ETL pipelines in the cloud with ADF that transform data at scale with Spark using HDInsight on-demand clusters or Azure Databricks Notebooks.
  7. SDK support: SDK support has been added and updated for Python, .NET, REST, and PowerShell to build custom applications with ADF.

With these exciting capabilities, Microsoft’s Cloud and Data partner Pragmatic Works is now able to engage with its customers in ways that were not possible before.

“Pragmatic Works helps customers solve business barriers once and for all with Azure cloud and data services and training. Azure Data Factory makes it extremely easy for Pragmatic Works’ customers to bring data together from on-premises and multi-cloud data sources. We can quickly build complex data integrations with the drag-and-drop browser user interface, allowing us to focus on delivering business value to our customers.”

Adam Jorgensen, President of Consulting, Pragmatic Works

Get started today!

Lift SQL Server Integration Services packages to Azure with Azure Data Factory


Data is vital to every app and experience we build today. With increasing amounts of data, organizations do not want to be tied down by the growing infrastructure costs that come with it. Data engineers and developers are realizing the need to start moving their on-premises workloads to the cloud to take advantage of its massive scale and flexibility. Azure Data Factory capabilities are generally available for SQL Server Integration Services (SSIS) customers to easily lift SSIS packages to Azure, gaining scalability, high availability, and lower TCO, while ADF manages resources for them.


Using code-free ADF UI/app, data engineers and developers can now provision and monitor Azure-SSIS Integration Runtime (IR) which are dedicated ADF servers for SSIS package executions. This capability now comes with amazing new features:

Data engineers and developers can continue to use familiar SQL Server Data Tools (SSDT) and SQL Server Management Studio (SSMS) to design, deploy, configure, execute, and monitor SSIS packages in the cloud. All of these capabilities are now generally available. Modernize and extend ETL workflows to automatically provision Azure-SSIS IR on-demand, just-in-time, inject built-in data transformations, and much more with SSIS in Azure Data Factory.

Get started

Azure simplifies cloud analytics


The modern business landscape is ruled by data, with analytics and AI now essential for driving transformation. Customers have benefited tremendously from the performance, flexibility, and low cost offered by Azure for analytics and AI workloads. Today, we are introducing new capabilities in Azure that make it easier to deliver, build, and manage powerful analytics and AI solutions.

First, we are excited to announce the preview of Azure Data Lake Storage Gen2, the only cloud scale data lake designed specifically for mission critical analytics and AI workloads. Azure Data Lake Storage Gen2 combines the scalability and cost benefits of object storage with the reliability and performance offered by the Hadoop file system capabilities.

We are also pleased to announce the general availability of new capabilities in Azure Data Factory. Now, integrating data from multiple sources to validate, enrich, and transform data for insights is dramatically simplified.

This evolution of the Microsoft analytics portfolio makes it easier for customers to integrate disparate data sources, then store and process large amounts of data economically to accelerate their digital transformation.

Taking Azure Data Lake Storage to the next level

Analytics solutions such as Hadoop have been designed assuming they run on scale-out file systems. Other cloud providers shoehorn these solutions onto a combination of client-side file system emulation and feature-deficient object stores, resulting in poor performance and inconsistent reliability, and ultimately forcing compromise.

Azure Data Lake Storage Gen2 offers a no-compromise data lake. It unifies the core capabilities from the first generation of Azure Data Lake with a Hadoop compatible file system endpoint now directly integrated into Azure Blob Storage. This enhancement combines the scale and cost benefits of object storage with the reliability and performance typically associated only with on-premises file systems. This new file system includes a full hierarchical namespace that makes files and folders first class citizens, translating to faster, more reliable analytic job execution.

Azure Data Lake Storage Gen2 also includes limitless storage, ensuring capacity to meet the needs of even the largest, most complex workloads. In addition, Azure Data Lake Storage Gen2 will deliver native integration with Azure Active Directory and support POSIX-compliant ACLs to enable granular permission assignments on files and folders.

As Azure Data Lake Storage Gen2 is fully integrated with Blob storage, customers can access data through the new file system-oriented APIs or the object store APIs from Blob Storage. Customers also have all the benefits of Azure Blob Storage including encryption at rest, object level tiering, and lifecycle policies as well as HA/DR capabilities such as ZRS and GRS. All of this will come at a lower cost and lower overall TCO for customers’ analytics projects! Azure Data Lake Storage Gen2 is the most comprehensive data lake available anywhere. At general availability, Azure Data Lake Storage Gen2 will be available in all Azure regions.
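For reference, the new file system-oriented endpoint is addressed through the ABFS driver URI scheme, which takes roughly this shape (the filesystem, account, and path segments are placeholders):

```
abfss://<filesystem>@<account>.dfs.core.windows.net/<path>
```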

To enable a seamless experience with leading Open Source providers of Hadoop and Spark analytics engines, we are working closely with our partners to make Azure Data Lake Storage Gen2 the most optimized data lake solution for customers.

“As a key partner, Cloudera has been working very closely with Microsoft since our integration of CDH with the first generation of Azure Data Lake. We are confident that Azure Data Lake Storage Gen2 will provide a superior experience for our CDH customers, specifically from a performance and stability perspective. We are very excited to announce our commitment in providing comprehensive platform support for Azure Data Lake Storage Gen2.”

- Vikram Makhija, General Manager for Cloud, Cloudera

Data integration simplified with Azure Data Factory

With the proliferation of big data, organizations no longer wish to be weighed down by the complexity of integrating their data to drive the analytical insights their business requires. Now generally available, the new capabilities in Azure Data Factory, Azure’s cloud-based data ingestion and integration service, make it easier than ever before to turn raw data into actionable insights.

With a drag-and-drop graphical user interface, data engineers and developers can quickly and easily create, schedule, and manage data integration at scale. Azure Data Factory now supports code-free data ingestion from over 70 data source connectors to accelerate data movement across on-premises, cloud, and application sources. We have also made a native Azure Data Factory connector for Azure Data Lake Storage Gen2 available in preview, so customers can take advantage of Azure Data Lake Storage Gen2 and easily migrate data from other data sources, including the first generation of Azure Data Lake.

Data engineers and developers can also easily lift SQL Server Integration Services (SSIS) packages to Azure and let Azure Data Factory manage their resources for them, achieving high scalability and availability while reducing operational costs. London-based data analytics consultancy Concentra Analytics has seen an 80 percent reduction in automated data warehouse development time by moving their SSIS packages to Azure.

“We have no problem working with customers that have data distributed between on-premises and cloud sources, even those with large datasets. With Azure Data Factory, our customers use the DataPlus auto-generated SSIS packages published in Azure to achieve scalability.”

- Weelin Lim, Director of Business Intelligence, Concentra Analytics

Azure is the best place for analytics

We are committed to making Azure the best place for organizations to unlock the insights hidden in their data to accelerate innovation. Customers can benefit from tight integration with other Azure services for building powerful, end-to-end, cloud-scale analytics solutions that support modern data warehousing, advanced analytics, and real-time analytics easily and more economically.

Big Data and advanced analytics

To find out more about Azure Data Lake Storage you can:

To find out more about Azure Data Factory you can:

Expanding Azure Certified for IoT program for the intelligent edge


Three years ago, we launched the Azure Certified for IoT program to help customers ensure their device of choice was tested to work with Azure IoT technology. Since then, our customers and partners have embraced the benefits of bridging the cloud with IoT devices together. With their enthusiasm, we have grown Azure Certified for IoT into one of the largest hardware ecosystems in the industry, with more than 250 partners and 1,000 different devices and starter kits already discoverable in the Azure IoT device catalog.

With the emergence of the intelligent edge and hardware innovations, we are expanding the certification program to support a wide range of hardware from low powered, constrained devices to AI-capable industrial gateways. We introduced Azure IoT Edge as a fully supported edge offering over a year ago, supporting Windows and Linux devices, and have seen huge customer momentum with increasing use of devices at the edge.

“Intelligent computing with real-time analytics at the edge is a key trend going forward – and increasingly a business requirement in the IoT business.”

–  Tomoyasu Suzuki, President of Plat'Home Co., Ltd

Today, we are excited to announce certification of Azure IoT Edge devices in the Azure Certified for IoT program, supporting certification of core functionalities such as device management, security, and advanced analytics. We have seen extremely positive momentum from hardware partners such as Advantech, Beckhoff Automation, Dell, HPe, Moxa, NexCom, Plat’Home, and Toshiba — and can’t wait for you to join us.

Azure IoT Edge device certification program overview

The IoT Edge certification program is built on a capability-based certification concept. Each capability has its own levels, providing the granularity that device seekers are looking for in an IoT Edge device and allowing the Azure Certified for IoT program to evolve in the future.

Each capability contains its own leveling with “Level 1” being the lowest.

For a device to be certified as an IoT Edge device, it needs to pass all mandatory requirements:

  • [Mandatory] Edge runtime (Level 1 only)
  • [Mandatory] Device management (Level 1 only)
  • [Optional] Security (4 levels: Level 1 – 4)

Device prerequisites

An IoT Edge device must have the Azure IoT Edge runtime pre-installed to be certified as an Azure IoT Edge device. Pre-installing the IoT Edge runtime can occur at multiple stages in the value chain.

IoT Edge device certification certifies against the Azure IoT Edge runtime pre-installed in the device by either OEMs or channel partners, to provide the best out-of-the-box experience on IoT Edge devices. However, this does not mean that the Azure IoT Edge runtime does not run on or support devices that are not certified.

Certification criteria: Description of capabilities and levels

The following describes the IoT Edge device certification criteria and the associated capabilities for each level.

  • Device management: Basic device management operations (reboot, FW/OS upgrades) triggered by messages from IoT Hub.
  • Security: Azure IoT Edge is secure from the ground up. However, the additional threats that come with operating at the edge demand security enforcement using secure hardware. This certification aims to communicate a diligence to security that goes above and beyond what Azure IoT Edge provides, as in deployments using HSM-secured devices.

The capabilities below describe the risks that the device’s mitigation capabilities address. This is neither a security guarantee nor a statement of the strength of the security.

| Security feature | Standard feature | Secure element | Secure enclave |
| --- | --- | --- | --- |
| Secure hardware requirements | None | Standalone security processor (e.g. TPM and secure elements) | Integrated security processor |
| Expectation | Edge base security processes | Secure hardware protection of storage and use of secrets (e.g. keys) | Secure element features, plus protection of the execution environment |
| Examples of typical transactions | All transactions in accordance with deployment risk assessment | Authentication; session key generation; certificates processing | All of the secure element transactions, plus: metering; billing; secure I/O; secure logging |
| Max security grading | Level 2 | Level 4 | Level 4 |

| Grading | Level 1 | Level 2 | Level 3 | Level 4 |
| --- | --- | --- | --- | --- |
| Requirements | Custom | Azure Device SDK | Azure Device SDK; FIPS 140-2 Level 2; Common Criteria EAL 3+ | Azure Device SDK; FIPS 140-2 Level 3; Common Criteria EAL 4+ |

Read about Microsoft’s approach to delivering a secure platform for Azure IoT Edge devices in the blog post Securing the intelligent edge. Microsoft is working to define the validation process for security requirements, including exploring the use of third-party validation labs.

Next steps

If you are a hardware partner and want to certify your IoT Edge device today, you can submit your IoT Edge device through the partner dashboard.

If you have any questions, please contact Azure Certified for IoT at iotcert@microsoft.com.

Microsoft from GeekWire Cloud Tech Summit: New Azure innovations will advance the intelligent cloud and intelligent edge


Today, I gathered with the tech community in the Seattle area at the GeekWire Cloud Tech Summit to talk about how customers are using the cloud and what the future holds. I joined GeekWire’s Todd Bishop and Tom Krazit on stage for a fireside chat to share more about Microsoft’s vision for emerging cloud innovation, but I also got to connect with many of you directly. In those conversations, it became even more apparent just how many of you are turning to the intelligent cloud to explore how emerging innovation like serverless, blockchain, edge computing, and AI can help you create solutions that can change your business — and people’s lives.

At Microsoft, we’re continually releasing technology that’s inspired by our customers and what you tell us you need to make the intelligent cloud and intelligent edge a reality for your businesses. For example, you may need to build applications to work in remote areas with low connectivity. Or you need to store, access, and drive insights from your data faster because of competitive pressures. And, you need confidence that your data and applications will be secure, resilient, and highly available across the globe.

During my time with Tom and Todd, I announced a few of our latest Azure solutions and newest regions and availability zones designed to bring this vision to you, our customers, and I’d like to share more on these new technologies.

Introducing the next level in big data analytics

Azure Data Lake Storage Gen2 and general availability of Azure Data Factory capabilities

Data is currency for enterprises today, and we know you need to be able to easily store and quickly access your data to drive actionable insights. You also need to be able to ingest and integrate big data quickly and easily to get to insights more readily. Today, we are introducing a preview of a highly scalable, highly performant, and cost-effective data lake solution for big data analytics, Azure Data Lake Storage Gen2, delivering the scale, performance, and security needed by your most demanding and sensitive workloads.

Because Azure Data Lake Gen2 is built on the foundations of Azure blob storage, all data — from data that is constantly in use through to data that only needs to be referenced occasionally or stored for regulatory reasons — can coexist in a single store without having to copy data. This means that you will have greater speed to insight over your data along with rich security at a more cost-effective price. Azure Data Lake Storage also provides a unified data store where unstructured object data and file data can be accessed concurrently via Blob Storage and Hadoop File System protocols.

Today also brings the general availability of new features in Azure Data Factory that deliver data movement as a service, so you can build analytics across hybrid and multicloud environments and turn raw data into actionable insights. The new features include a web-based graphical user interface to create, schedule, and manage data pipelines; code-free data ingestion from more than 70 data source connectors to accelerate data movement across on-premises and cloud; and the ability to easily lift SQL Server Integration Services packages to Azure and run them in a managed execution environment in Azure Data Factory. You can also start taking advantage of the native ADF connector for Azure Data Lake Storage to load your data lake at scale.

Enabling the intelligent edge

Azure IoT Edge is generally available

In the next 10 years, nearly all our everyday devices and many new devices will be connected. These devices are all becoming so “smart” that they can power advanced algorithms that help them see, listen, reason, predict and more, without a 24/7 dependence on the cloud. This is the intelligent edge, and it will define the next wave of innovation in how we address world issues: distributing resources like water and oil, increasing food production and quality, and responding to natural disasters.

A key part of our strategy to deliver the promise of edge computing is Azure IoT Edge, which enables consistency between cloud and edge. This means you can push AI and machine learning to the edge, making it the most comprehensive and innovative edge offering on the market. As of today, Azure IoT Edge is generally available globally, with new updates for increased flexibility, scalability, security, and more.

Also today, the Azure IoT Edge runtime is open sourced and available on GitHub. If you are a developer, this gives you even greater flexibility and control of your edge solutions, so you can modify the runtime and debug issues. Azure IoT Edge now also supports more languages than any other edge solution, including C#, C, Node.js, Python, and Java, and we’ve added support for the Moby container management system. Additionally, we’ve released a Device Provisioning Service that enables you to provision tens of thousands of devices with zero touch. The new Security Manager for Azure IoT Edge acts as a well-bounded security core for protecting the IoT Edge device and all its components by abstracting the secure silicon hardware.

Azure IoT Edge customers like Schneider Electric and a farmer in Carnation, Washington are building sophisticated solutions that deliver real-time insights in areas with unreliable connectivity. Now that the solution is production-ready, with enhanced features, we can’t wait to see what else you build.

Azure global infrastructure

New Azure regions

We continuously invest in our cloud infrastructure to give you more compute power to enable the intelligent cloud and intelligent edge. We’ve announced 54 Azure regions to help you deliver cloud services and apps to nearly every corner of the globe and to provide everything that’s needed to run mission-critical applications, across scenarios, with a full set of resiliency solutions.

Today, we expanded our Azure presence in China, one of the most dynamic cloud markets in the world, with two additional regions now generally available. We were the first international cloud provider in China in 2014 (in partnership with 21Vianet), and today’s announcement doubles the number of Azure regions available there. We continue to see immense opportunity in China for cloud services to fuel innovation, and multinational corporations including Adobe, Coke, Costco, Daimler, Ford, Nuance, P&G, and Toyota are choosing our intelligent cloud services to help deliver for their customers in China. This builds on our recently announced plans to expand our cloud infrastructure in Europe and the Middle East, and our announced plans for new regions coming to Norway.

We’re also constantly increasing Azure’s resiliency capabilities with the addition of new Azure Availability Zones. Our Availability Zone in the Netherlands is now generally available, adding to the Zones already available in Iowa and Paris. The combination of region pairs and Availability Zones not only increases Azure’s resiliency capabilities, but broadens customer choice for business continuity architectures and delivers an industry-leading SLA for virtual machines.

It’s an exciting future

It was great to see all of you at GeekWire Cloud Tech Summit today. For those of you who weren’t able to be there live, you can follow along online. As always, we will continue to focus on building the technologies you need to drive innovation and disruption with the intelligent cloud and intelligent edge. Let us know what you think about these new solutions by sharing your feedback and comments.

Azure IoT Edge generally available for enterprise-grade, scaled deployments


Since we introduced Azure IoT Edge just over a year ago, we have seen many examples of its real-world impact, from the factory floor to the farm, of running cloud intelligence directly on IoT devices. Now devices can act immediately on real-time data, whether recognizing a crack in a pipe from an aerial view or predicting equipment failure before it happens. As we evolve toward a world of ubiquitous computing, the design of an IoT solution spanning hardware, edge, and cloud must be consistent and secure to drive real impact. Today, we are excited to announce that Azure IoT Edge is now generally available (GA) globally, enabling our growing list of enterprise customers to bring their edge solutions to production.

We are also introducing new robust capabilities on Azure IoT Edge to easily develop and deploy intelligence to the edge. These robust updates position Azure IoT Edge as a true end-to-end solution for enterprise-grade edge deployments.

New updates to develop and deploy intelligent applications to Azure IoT Edge

Open and flexible for greater choice

  • Open source Azure IoT Edge: Today with GA, IoT Edge is open sourced and available on GitHub. This continues our commitment to open source support for Azure IoT Edge and gives developers even greater flexibility and control of their edge solutions, enabling them to modify the runtime and debug issues.
  • Support for Moby container management system: Moby is the open-source platform Docker is built on, allowing us to extend the concepts of containerization, isolation, and management from the cloud to devices at the edge. Moby containers work on Docker-based systems, and vice versa. There are no changes required to existing (Docker-based) modules.
  • Ecosystem of certified hardware and software for the edge: We are expanding the Azure Certified for IoT program to certify core edge functionalities such as device management and security. You can find already certified edge hardware on the device catalog. Learn more about the Azure Certified for IoT program. In addition to hardware, developers can find pre-built edge modules now available through Azure Marketplace to accelerate edge solution development.

Secure from hardware to cloud for solutions that scale

  • Azure IoT Device Provisioning Service: Azure IoT Edge now has deep integration with Device Provisioning Service for zero-touch provisioning so that a device can simply be provisioned in the field with no operator intervention. With Device Provisioning Service, customers can securely provision tens of thousands of devices – bringing true scale to edge deployments.
  • Azure IoT Edge security manager acts as a well-bounded security core for protecting the IoT Edge device and all its components by abstracting the secure silicon hardware. It is the focal point for security hardening and provides Original Device Manufacturers (OEM) the opportunity to harden their devices based on their choice of Hardware Secure Modules (HSM).
  • Automatic Device Management (ADM) service allows scaled deployment of IoT Edge modules to a fleet of devices based on device meta data. When a device with the right meta data (tags) joins the fleet, ADM brings down the right modules and puts the edge device in the correct state.
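
As an illustrative sketch of the tag-based targeting described above (the deployment ID, tag names, and module image are hypothetical, and the manifest is heavily abbreviated), an automatic deployment pairs a target condition over device twin tags with the module content to apply:

```json
{
  "id": "thermostat-fleet-deployment",
  "priority": 10,
  "targetCondition": "tags.environment='production' AND tags.building='43'",
  "content": {
    "modulesContent": {
      "$edgeAgent": {
        "properties.desired": {
          "modules": {
            "telemetryFilter": {
              "type": "docker",
              "settings": { "image": "myregistry.azurecr.io/telemetry-filter:1.0" }
            }
          }
        }
      }
    }
  }
}
```

Any device whose twin tags later change to match the target condition is picked up automatically; ADM then delivers the listed modules without the operator touching the device.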

New and simplified developer experience

  • Broad language support for module SDKs: Azure IoT Edge supports more languages than other edge offerings today, including C#, C, Node.js, Python, and Java so you can program edge modules in the language of your choice.
  • Tooling for VSCode: Simplify module development by coding, testing, debugging, and deploying all from VSCode.
  • CI/CD pipeline with VSTS allows managing of the complete lifecycle of the Azure IoT Edge modules from development, testing, staging, and finally deployment. All of this is now possible in VSTS using tools already familiar to developers.

This continues recent momentum and news at our //Build developer conference last month, where we announced new IoT Edge capabilities including integration with Microsoft AI services and container support for Kubernetes, new partnerships, and integrations with third-party hardware including DJI drones and Qualcomm vision AI developer kit at the edge.

Customer and partner momentum

Our preview customers are building sophisticated solutions that enable them to gain real-time insights in remote areas or areas with unreliable connectivity using Microsoft’s AI services supported on Azure IoT Edge. This includes Schneider Electric, a leader in energy, who is doing predictive maintenance on equipment that signals it’s about to break so that it does not become a safety and waste hazard. And a farmer in Carnation, Washington using DJI drones and Microsoft’s FarmBeats solution on Azure IoT Edge to do precision agriculture. Here is what a few of our customers and partners have to say:

  • “At Chevron, we are committed to producing the energy that powers the world. As such, it is imperative that we innovate with the latest technology to continuously improve safety and reliability and enhance our company’s performance. Edge computing is the next wave of cloud and IIOT innovation. With the ability to run Azure IoT Edge on Azure Stack, we see an opportunity to increase uptime and real-time insights on the performance of our operating equipment with intelligent applications that can run right on the device and in remote areas with limited or interrupted connectivity.” – Deon Rae, Chevron Fellow & IIOT Center of Excellence Lead
  • "As a leading manufacturer of industrial equipment, we leverage cloud intelligence to understand how the equipment is performing and when maintenance is needed. With Azure IoT Edge, we have the potential to bring this intelligence down to the device level, to predict failures and maintenance in real time, without a 24/7 dependence on connectivity." –  Alasdair Monk, Group Product Management Director IoT, Weir Group​
  • “At Vulcan Steel we are committed to promoting a safe working environment. With Microsoft, we saw an opportunity to increase safety bringing cloud intelligence to the edge using AI and Azure IoT Edge to review thousands of pieces of footage a day and highlight potentially risky behavior that could lead to an accident in the loading and unloading of trucks. This real-time intelligence enables us to use predictive insights to direct education efforts, improving safety for our people.” – James Wells, Chief Information Officer, Vulcan Steel
  • “Amano McGann has a reputation for bringing leading, innovative technology to the smart parking and urban mobility industry across North America. We are extending our work with Microsoft Azure to the edge with Azure IoT Edge to increase reliability and uptime of our services, even with limited or no connectivity. We have the industry’s largest and most experienced staff of engineers and R&D specialists working to bring technologically advanced products to the market, and we are proud to take this to the next wave of innovation to the edge.” – Rohit Chande, Senior Vice President of Engineering Services, Amano McGann
  • “Redis Labs is delighted to be partnering with Microsoft Azure on IoT Edge solutions. This is a natural partnership as both companies are putting significant focus and investment in accelerating intelligence and computing at the edge. The combination of Redis Enterprise in the Azure IoT Edge solution will enable joint customers and partners to enjoy fast performance, in-memory processing and high availability for their edge solutions. Azure IoT Edge together with Redis Enterprise will increase speed to business value with edge solutions.” – Rob Schauble, Vice President of IoT & Emerging Technologies, Redis Labs
  • “Moxa selected Azure IoT Edge for customers who have standardized on Microsoft and Azure and want to extend their IT infrastructure to high-volume industrial edge deployments. Our integration with Microsoft Azure as a Strategic Partner will ensure customers can more easily deploy IIoT Edge Gateways into existing industrial environments and build upon Moxa’s 30 years of industrial expertise for connectivity, long-term Industrial Linux support, and a broad range of industrial connectivity.” – Bee Lee, Executive President, Moxa Sales and Marketing

Azure IoT Edge pricing

There are three components required for Azure IoT Edge deployment: Azure IoT Edge Runtime, Azure IoT Hub, and edge modules. The Azure IoT Edge runtime is free and will be available as open source code. Customers will need an Azure IoT Hub instance for edge device management and deployment if they are not using one for their IoT solution already. Learn more about first-party services pricing.

Start developing intelligent apps for the edge with Azure IoT Edge today

Just this spring, we announced that Microsoft will invest $5B in IoT and the intelligent edge over the next four years to accelerate innovation while also simplifying the customer experience of building IoT solutions. Today’s announcement is the latest example of innovation in IoT and the edge. With today’s news, we are also releasing tutorials, demos, and videos so you can get started quickly. We look forward to seeing what you build.

Enterprises get deeper insights with Hadoop and Spark updates on Azure HDInsight


Azure HDInsight is one of the most popular services among enterprises for open source Hadoop and Spark analytics on Azure. With the more than 50 percent price cut on HDInsight, customers moving to the cloud are reaping more savings than ever.

“PROS is a pioneer in using machine learning to give companies accurate and profitable pricing. The PROS Guidance product runs enormously complex pricing calculations based on variables that comprise multiple terabytes of data. In Azure HDInsight, a process that formerly took several days now takes just a few minutes.” – Ed Gonzalez, Product Manager, PROS

Today we are announcing updates to Apache Spark, Apache Kafka, ML Services, and Azure Data Lake Storage Gen2, along with enhancements to the Enterprise Security Package. These new capabilities will continue to drive savings for many of our customers. In addition, Microsoft is continuing to deepen its commitment to the Apache Hadoop ecosystem and has extended its partnership with Hortonworks to bring the best of Apache Hadoop and open source big data analytics to the cloud.


Continued investment in Open Source for new capabilities and reliability

Reliable Open Source

Microsoft is contributing to the Apache Hadoop ecosystem and also ensuring Azure is the most reliable place to run it. The ecosystem has dependencies on many open source projects, which need to be configured together for the stack to work. HDInsight provides pre-tuned clusters out of the box for the best performance. Today we are enabling updates to Apache Hadoop, Apache Spark 2.3, and Apache Kafka 1.0, along with more than 2,000 bug fixes across more than 20 open source frameworks that are part of HDInsight.

“HDInsight is the best place to run open source frameworks for big data. It meets key enterprise needs allowing them to easily modernize their on-premise solutions on a fully managed HDInsight cloud service.” - Rohan Kumar, Corporate Vice President, Microsoft

Enabling data scientists with Machine Learning Services 9.3

Today, we are excited to announce the general availability of Machine Learning (ML) Services 9.3 on Azure HDInsight. With this release, we are providing data scientists and engineers with the best of open source enhanced with algorithmic innovations and ease of operationalization, all available in their preferred language with the speed of Apache Spark. This release expands upon the capabilities offered in R Server with added support for Python, leading to the cluster name change from R Server to ML Services. Learn more with this introduction to R Server and open-source R capabilities on HDInsight.

Introducing an all new Azure Data Lake Storage Gen2, the Data Lake for everyone


Today’s data lake options require customers to trade off scalability, availability, and cost against the security they require. Azure Data Lake Storage provides a no-compromise foundation for building scalable and highly performant big data analytics solutions without trading off security and cost. Microsoft is announcing a preview of Azure Data Lake Storage Gen2, a globally available HDFS file system to store and analyze petabyte-size files and trillions of objects. Today we are also excited to announce a preview of HDInsight with Azure Data Lake Storage Gen2.

Uncompromising on security: Virtual Network Service Endpoints


Today we are enhancing HDInsight to include support for Virtual Network Service Endpoints, which allow customers to securely connect to Azure Blob Storage, Azure Data Lake Storage Gen2, Cosmos DB, and SQL databases. By enabling a Service Endpoint for Azure HDInsight, traffic flows through a secured route from within the Azure data center.

Secure and compliant

HDInsight brings enterprise-grade protection of your data with encryption, virtual networks, Active Directory-based authentication, role-based authorization, fine-grained access control, a single pane of glass for monitoring, and more. The service is globally available in more than 20 regions, including sovereign clouds in the US, Germany, and China, and meets key compliance standards such as HIPAA, PCI, ISO, and more.

Delivering cost-effectiveness


With pay-per-use billing, on-demand clusters, the ability to scale up or down, and the separation of compute and storage, customers are able to run their big data jobs more efficiently. We reduced the price of HDInsight by more than 50 percent, so you can enjoy even more cost savings.

Try HDInsight now

We hope you take full advantage of today’s announcements and we are excited to see what you will build with Azure HDInsight. Read this developer guide and follow the quick start guide to learn more about implementing these pipelines and architectures on Azure HDInsight. Stay up-to-date on the latest Azure HDInsight news and features by following us on Twitter #HDInsight and @AzureHDInsight. For questions and feedback, please reach out to AskHDInsight@microsoft.com.

About HDInsight

Azure HDInsight is Microsoft’s premium managed offering for running open source workloads on Azure. Today, we are excited to announce several new capabilities across a wide range of OSS frameworks.

Azure HDInsight powers some of our top customers’ mission-critical applications across a wide variety of sectors, including manufacturing, retail, education, nonprofit, government, healthcare, media, banking, telecommunications, and insurance, with use cases ranging from ETL to data warehousing and from machine learning to IoT.


Azure Elastic Database jobs is now in public preview


We are excited to announce the availability of a new, significantly upgraded public preview release of Azure Elastic Database jobs. Elastic Database jobs is now a fully Azure-hosted service. Unlike the earlier, customer-hosted and managed version of Elastic Database jobs, this version is an integral part of Azure with no additional services or components to install and configure. This release also adds significant capabilities making it easy for customers to automate and execute T-SQL jobs using PowerShell, REST, or T-SQL APIs against a group of databases. These jobs can be used to handle a wide variety of tasks such as index rebuilding, schema updates, collection of query results for analytics, and performance monitoring.

Imagine you are a SaaS developer in the cloud offering online ordering services for a collection of large stores. To support this and allow unlimited scale, you’ve provisioned separate Azure SQL databases to handle each store’s business, and this set of databases share a common schema. While your application directs customer transactions to the appropriate store’s databases, you want the capability to easily manage all databases jointly to ensure the performance and customer experience remain optimal. Additionally, you periodically have new schema to deploy to each database in preparation for a new feature you are planning to integrate into your application for your customers.

As new customers are added or dropped, additional databases need to be created or dropped without you having to change your maintenance job scripts. Azure Elastic Database jobs allow you to achieve all this and more using your familiar T-SQL based job scripts. These jobs can be scheduled, run, managed, and monitored using T-SQL, PowerShell, REST APIs, and the Azure portal. This updated version of Elastic Database jobs provides all these capabilities and adds significant additional features.


Some of the features and enhancements in this release include:

  • Elastic Database jobs is now a fully integrated Azure service.
  • Jobs can target databases in one or more Azure SQL elastic pools, logical servers, shard maps, and across multiple subscriptions.
  • Jobs can be composed of multiple steps to customize the execution sequence.
  • The list of target databases in a target group is dynamically enumerated. Any databases added to or dropped from the target group are automatically picked up without explicit job script changes.
  • Jobs can be configured to limit the number of databases a job runs against in parallel to ensure resource optimization.
  • Target groups can be customized with “include” or “exclude” references for specific databases, pools, or servers.
  • T-SQL, PowerShell, REST APIs, and Azure portal support.

Azure Elastic Database jobs makes automation of database maintenance and management across a large number of databases simpler and more reliable. Our tutorial walks you through the necessary steps to create and execute jobs over a group of databases, as well as to create schedules, execute jobs, and store the results. Elastic Database jobs can perform operations on databases in all service tiers of Azure SQL Database.
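
As a sketch of what such a T-SQL job might look like (the group, credential, server, and job names here are hypothetical), the jobs stored procedures in the job database define a target group, a job, and its steps:

```sql
-- Define a target group covering every database on a logical server
EXEC jobs.sp_add_target_group @target_group_name = 'StoreDatabases';
EXEC jobs.sp_add_target_group_member
     @target_group_name = 'StoreDatabases',
     @target_type = 'SqlServer',
     @refresh_credential_name = 'EnumCred',  -- credential used to enumerate databases
     @server_name = 'stores-server.database.windows.net';

-- Create a job with a single T-SQL step, e.g. an index rebuild
EXEC jobs.sp_add_job @job_name = 'NightlyIndexMaintenance';
EXEC jobs.sp_add_jobstep
     @job_name = 'NightlyIndexMaintenance',
     @credential_name = 'JobCred',           -- credential used to run the step
     @target_group_name = 'StoreDatabases',
     @command = N'ALTER INDEX ALL ON dbo.Orders REBUILD;';

-- Run the job on demand (or attach a schedule instead)
EXEC jobs.sp_start_job 'NightlyIndexMaintenance';
```

Because the target group is enumerated at execution time, databases added to the server after the job is defined are included automatically.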

Next steps

Elastic Database jobs is part of the Azure SQL Database service, and there is no additional charge for using the public preview version. To get started, access the public preview of Elastic Database jobs and read more about getting started in our documentation.

Migration scripts are available for customers using the old customer-hosted preview version of Elastic Database jobs.

Use Azure Active Directory with Spring Security 5.0 for OAuth 2.0


We are excited to announce that Spring Starter for Azure Active Directory (AD) is now integrated with Spring Security 5.0. It offers an easy way to build an OAuth 2.0 authentication and authorization flow for your Java apps in the cloud, supporting both the implicit and authorization code grant types. With only a few lines of configuration, you can build apps that authenticate with Azure Active Directory OAuth 2.0 and manage authorization with Azure Active Directory groups.



Get started

To start, open the Azure portal and register a new application in Azure Active Directory (AD). Next, grant permissions to the newly created application. Use Azure AD groups and members to set up access rules, then add the Spring Security Azure AD library to your project. Depending on the kind of application you’re building, choose from the following two authentication types to build up the OAuth 2.0 authentication and authorization flow. Learn more about Spring Starter for Azure Active Directory on GitHub.

Back-end authentication

Once the Spring Security Azure AD library is added to the project, it automatically maps Azure AD groups to Spring Security authorization logic, allowing developers to build the OAuth 2.0 flow in the back end. To enable this, you only need to add the following configuration to specify the use of the OAuth2 user service. Then you can use the annotation @PreAuthorize("hasRole('GROUP_NAME')") for role-based authorization. To learn more, please review our example on GitHub.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;
import org.springframework.security.oauth2.client.oidc.userinfo.OidcUserRequest;
import org.springframework.security.oauth2.client.userinfo.OAuth2UserService;
import org.springframework.security.oauth2.core.oidc.user.OidcUser;

public class WebSecurityConfig extends WebSecurityConfigurerAdapter {

    // OIDC user service supplied by the Azure AD starter
    @Autowired
    private OAuth2UserService<OidcUserRequest, OidcUser> oidcUserService;

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http.authorizeRequests()
                .anyRequest().authenticated()       // require sign-in for every request
            .and()
            .oauth2Login()
                .userInfoEndpoint()
                .oidcUserService(oidcUserService);  // load Azure AD group memberships
    }
}

Front-end authentication

For a Single Page Application (SPA) scenario, use Azure AD library for JavaScript to handle Azure AD authentication in the front end, and autowire the AADAuthenticationFilter in your Spring Boot project. Then you can use the annotation @PreAuthorize("hasRole('GROUP_NAME')") for role-based authorization. Learn more by reviewing the Azure Active Directory Spring Boot sample.
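
As a sketch of that wiring (the URL patterns are illustrative, and this assumes the AADAuthenticationFilter class from the Azure AD Spring Boot library alongside standard Spring Security), the filter can be registered ahead of the default authentication filters:

```java
@Autowired
private AADAuthenticationFilter aadAuthFilter;

@Override
protected void configure(HttpSecurity http) throws Exception {
    http.authorizeRequests()
            .antMatchers("/", "/index.html").permitAll()   // public SPA assets
            .antMatchers("/api/**").authenticated()        // endpoints guarded by Azure AD
        .and()
        // Validate the Azure AD token sent by the SPA before the standard filters run
        .addFilterBefore(aadAuthFilter, UsernamePasswordAuthenticationFilter.class);
}
```

The SPA obtains a token through the Azure AD library for JavaScript and sends it with each API call; the filter validates it and populates the Spring Security context so that @PreAuthorize checks work as in the back-end case.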

Next steps

Check out our project on GitHub and learn about Spring integrations with Azure services.

Feedback

Please share your feedback and ask questions to help us improve by commenting below or contacting us on GitHub.

Automatic device management, module identity, and module twin are now generally available


Last month, we released Azure IoT Hub automatic device management, and module identity and module twins. Each of these features enable scenarios to enhance device management capabilities within your IoT application built on Azure IoT Hub. Today, we are excited to announce that they are generally available with the same great support you’ve come to know and expect from Azure IoT services.

Automatic device management

Automatic device management automates many of the repetitive and complex tasks of managing large device fleets over the entirety of their lifecycles. With automatic device management, you can target a set of devices based on their properties, define a desired configuration, and let IoT Hub update devices whenever they come into scope. We offer two services in automatic device management for different scenarios - automatic device configurations and IoT Edge automatic deployments.


Automatic device configurations

Automatic device configurations provide the ability to perform IoT device configuration at scale, including updating settings, installing software, and updating firmware, with reporting and conflict resolution handled automatically. They provide an additional layer of capability by building upon existing platform primitives, namely device twins and queries.

With general availability comes expanded SDK support. We now support service SDKs in the following languages: C, C#, Java, Node.js, and Python (coming soon). To help you achieve full automation, this release also adds automatic device configuration support to the IoT extension for the Azure CLI.

IoT Edge automatic deployments

IoT Edge automatic deployments enable cloud-driven deployment of Azure services and solution-specific code to IoT Edge devices. They behave very similarly to automatic device configurations, automatically updating devices that come into scope and handling reporting and merge conflicts. With general availability, IoT Edge deployments are now supported in the IoT extension for the Azure CLI, as well as in our SDKs across all supported languages. With IoT Edge itself generally available globally today, our growing list of enterprise customers can bring their edge solutions to production. For further details, see Sam George’s blog post Azure IoT Edge generally available for enterprise-grade, scaled deployments for the latest news on Azure IoT Edge.

Module identity and module twin

Module identity and module twin are similar to Azure IoT Hub device identity and device twin, but provide finer granularity. While Azure IoT Hub device identity and device twin enable the back-end application to configure a device and provide visibility on the device’s conditions, a module identity and module twin provide these capabilities for individual components of a device. On capable devices with multiple components, such as operating system based devices or firmware devices, it allows for isolated configuration and conditions for each component. Module identity and module twin provide a management separation of concerns when working with IoT devices that have modular software components.

With this release, module identity and module twin support both AMQP and MQTT on all operations. This release also brings expanded SDK language support for both the device and client side. We now support SDKs in the following languages: C, C#, Java, Node.js, and Python. Get started with module identity and module twin today with the quick start tutorials.
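For illustration, a module twin follows the same JSON shape as a device twin, scoped to a single module on a device. The device name, module name, and property below are hypothetical:

```json
{
  "deviceId": "thermostat-01",
  "moduleId": "temperatureFilter",
  "properties": {
    "desired": {
      "sampleRateHz": 10
    },
    "reported": {
      "sampleRateHz": 10
    }
  }
}
```

A back-end application writes desired properties for just that module, and the module reports its own state back, giving each software component on the device its own isolated configuration channel.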

We would love to hear your feedback on IoT Hub device management, so please continue to submit your suggestions through the Azure IoT User Voice forum.

The Financial Times and BBC use R for publication graphics


While graphics guru Edward Tufte recently claimed that "R coders and users just can't do words on graphics and typography" and need additional tools to make graphics that aren't "clunky", data journalists at major publications beg to differ. The BBC has been creating graphics "purely in R" for some time, with a typography style matching that of the BBC website. Senior BBC Data Journalist Christine Jeavans offers several examples, including this chart of life expectancy differences between men and women:

BBC-gender

... and this chart on gender pay gaps at large British banks:

BBC-pay

Meanwhile, the chart below was made for the Financial Times using just R and the ggplot2 package, "down to the custom FT font and the white bar in the top left", according to data journalist John Burn-Murdoch.

FT-world-cup

There are also entire collections devoted to recreating Tufte's own visualizations in R, presumably meeting his typography standards. Tufte later clarified, saying "Problem is not code, problem is published practice", which is true of any programming environment and makes it strange that he called out R in particular.

CMake Support in Visual Studio – Configuration Templates


Visual Studio 2017 15.8 Preview 3 is now available and it includes several improvements to the CMake tools. In addition to a few fixes we have simplified the way you can configure your CMakeSettings.json file by adding configuration templates.

If you are new to CMake in Visual Studio, check out how to get started.

Configuration Templates for CMake

If you have created a CMakeSettings.json file to customize your project’s settings in the past, you might know that the file could be a little daunting. With both the Desktop and Linux workloads installed, the default CMakeSettings.json template was over 100 lines long. We received feedback from the community that this file was difficult to unpack and understand, so we have simplified the way we populate the default CMakeSettings.json file. We have also provided a way to access and edit this configuration file directly from the configuration selector.

Getting Started

When you open a CMake folder for the first time, we now implicitly create just one debug configuration that matches the current operating system. This way, you are still able to get up and running to target the local machine without needing to create a CMakeSettings.json file at all. However, once you do want to add or customize the CMake configuration list, it is easier than ever to do so:

Create CMakeSettings from Config Dropdown

Clicking “Manage Configurations…” in the configuration selection dropdown (to the right of the F5 button) will ask you to select the template that best fits how you want to build and debug your project. Visual Studio will add the selected template to a newly created CMakeSettings.json in the folder containing the root CMakeLists.txt file.

Configuration Template Selector

This has all the configurations we supported before plus a few new ones for targeting MinGW and ARM IoT devices. Once you select a template, it will be added to your CMakeSettings.json file. If we take “x86-Debug” for example, we see that it results in a very compact CMakeSettings.json:

New CMakeSettings.json Content

At only 17 lines, this is quite a bit more manageable than this file has been in the past.
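Since the screenshot does not reproduce here, the sketch below shows the kind of compact file the “x86-Debug” template produces; exact field names and default values may differ slightly between Visual Studio versions, so treat this as an approximation rather than the verbatim template:

```json
{
  "configurations": [
    {
      "name": "x86-Debug",
      "generator": "Ninja",
      "configurationType": "Debug",
      "inheritEnvironments": [ "msvc_x86" ],
      "buildRoot": "${env.USERPROFILE}\\CMakeBuilds\\${workspaceHash}\\build\\${name}",
      "installRoot": "${env.USERPROFILE}\\CMakeBuilds\\${workspaceHash}\\install\\${name}",
      "cmakeCommandArgs": "",
      "buildCommandArgs": "-v",
      "ctestCommandArgs": ""
    }
  ]
}
```

Each entry in the "configurations" array becomes a selectable item in the configuration dropdown, so adding a new target is just a matter of appending another object.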

Adding New Configurations

Once you create this file, the “Manage Configurations…” button in the configuration selection dropdown will take you back to the CMakeSettings.json file. However, that doesn’t mean you can’t add more templates to it. To add new templates after creating the file, just right-click anywhere in the editor and select Add Configuration:

Add a Configuration from the Editor

Alternatively, you can right click on the file itself in the Solution Explorer:

Add Configuration from the Solution Explorer

The newly selected template will be appended to the end of your configuration file.

Custom Configuration Templates (Experimental)

In addition to the built-in configuration templates, we have added an experimental way for you to create your own templates as well. Under Tools > Options > CMake there is now a “CMakeSettings.json Template Directory” entry:

Custom Templates in Tools > Options > CMake

You can point this at a directory with one or more JSON files containing custom templates for CMake. These files can have any name and essentially have the same format as a standard CMakeSettings.json file. The array of “configurations” will be treated as templates and will appear in the template selector. To set a template’s description in the template selector, just add a “description” field to the configuration’s JSON.
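A minimal sketch of such a custom template file follows; the configuration name, description, and toolchain path are hypothetical placeholders, and only the "description" field is specific to templates:

```json
{
  "configurations": [
    {
      "name": "my-toolchain-debug",
      "description": "Debug build using our in-house CMake toolchain file",
      "generator": "Ninja",
      "configurationType": "Debug",
      "buildRoot": "${env.USERPROFILE}\\CMakeBuilds\\${workspaceHash}\\build\\${name}",
      "cmakeCommandArgs": "-DCMAKE_TOOLCHAIN_FILE=C:/toolchains/custom.cmake"
    }
  ]
}
```

Dropping this file into the directory configured under Tools > Options > CMake would make “my-toolchain-debug” appear in the template selector with the description shown above.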

We would love to hear how you use this and if it works well for you. Your feedback will help us shape this feature in future versions of Visual Studio.

Community Reported Issues and Other Improvements

We have fixed quite a few customer-reported issues in the latest Visual Studio Preview, including several related to our IDE integration with CTest.

In addition to the issues reported by the community, we have also addressed the following customer-impacting issues:

  • Fixed long delay when closing some CMake projects
  • Opening a new CMake project without closing a previously opened project no longer locks up the IDE
  • CMake project code scanning no longer waits forever if filters are defined in “.vscodesettings.json”
  • C# CMake projects now default to the Visual Studio generator and F5 starts the managed/mixed debugger (experimental)

Thanks again to everyone who reported these issues!

In addition to the changes to CMakeSettings.json and configuration, the latest version of the CMake tools also includes improved single-file compilation. All errors and warnings now appear in the Error List instead of only the Output Window.

Send Us Feedback

Your feedback is a critical part of ensuring that we can deliver the best CMake experience. We would love to know how Visual Studio 2017 Preview is working for you. If you have any feedback specific to CMake Tools, please reach out to cmake@microsoft.com. For general issues please Report a Problem.
