
Completed UserVoice Suggestions in Visual Studio for C++ developers


If you regularly follow our blog, you may have noticed that our posts encourage you to submit your suggestions on how to improve Visual Studio in UserVoice. We spend a lot of time reviewing your suggestions and incorporating them into our planning for future releases.

In this post, we’d like to give you an update on what suggestions we have recently marked as Complete. This list comes as an addendum to the list of completed UserVoice suggestions in Visual Studio 2017 that we announced when Visual Studio 2017 released earlier this year. The list has 19 completed suggestions, totaling 1458 votes! Did you make any of these suggestions or vote for them? Now you get your votes back and we look forward to seeing you put them to good use again!

Remember, if you voted on completed items, your votes are returned so you can support other suggestions. And if you have not yet made a suggestion or voted, you can participate in the Visual Studio forum on UserVoice by selecting Help > Send Feedback > Provide a Suggestion.

We look forward to your feedback!
— Visual C++ Team


Join Microsoft at Supercomputing 17


We’re excited for the Supercomputing conference next week in Denver, Colorado. Supercomputing is where researchers and practitioners from around the world come together to advance the state of high-performance computing. Microsoft will have a great presence at SC17.

In booth 1501, we’ll show how Microsoft Azure provides one cloud for every workload. Our technical staff will be on hand to show how Azure’s offerings can help advance artificial intelligence, simulation, hybrid cloud, and cloud native applications. And of course, we’ll be ready to talk about our exclusive partnership with Cray that gives customers the ability to get dedicated access to a supercomputer on their Azure virtual network.

We also have some great talks lined up in the booth theater. Customers, partners, and Microsoft staff will give talks on how Azure is helping people solve real-world problems. Some talks include:

  • “GPU computing solving the most demanding challenges of large-scale computing” from Han Vanholder, of NVIDIA
  • “FEA simulations on Azure” from Rob Walsh, of Altair
  • “Azure specialized compute strategy” from Brett Tanzer, of Microsoft

Stop by booth 1501 for the latest schedule.


Microsoft is well-represented in the technical content, as well. Our presentations include:

We’re also proud to be a sponsor of the Student Cluster Competition, which will include a cloud component for the first time, along with tables at the Student Career Fair. We look forward to seeing you in Denver next week!

Visual Studio Team Services in Hong Kong


Today we opened a new VSTS instance in Hong Kong – the Azure “East Asia” region. This adds to our instances in the US, Europe, Brazil, India, and Australia, giving us reasonably short-distance access to every part of the world except Africa. We will continue to build out local support in new regions and make sure we have capacity around the globe.

You can read more details about the new instance and how to get started with it on our DevOps blog.

Brian

Creating a Minimal ASP.NET Core Windows Container


This is a guest post by Mike Rousos

One of the benefits of containers is their small size, which allows them to be more quickly deployed and more efficiently packed onto a host than virtual machines could be. This post highlights some recent advances in Windows container technology and .NET Core technology that allow ASP.NET Core Windows Docker images to be reduced in size dramatically.

Before we dive in, it’s worth reflecting on whether Docker image size even matters. Remember, Docker images are built by a series of read-only layers. When using multiple images on a single host, common layers will be shared, so multiple images/containers using a base image (a particular Nano Server or Server Core image, for example) will only need that base image present once on the machine. Even when containers are instantiated, they use the shared image layers and only take up additional disk space with their writable top layer. These efficiencies in Docker mean that image size doesn’t matter as much as someone just learning about containerization might guess.

All that said, Docker image size does make some difference. Every time a VM is added to your Docker host cluster in a scale-out operation, the images need to be populated. Smaller images will get the new host node up and serving requests faster. Also, despite image layer sharing, it’s not unusual for Docker hosts to have dozens of different images (or more). Even if some of those share common layers, there will be differences between them and the disk space needed can begin to add up.

If you’re new to using Docker with ASP.NET Core and want to read up on the basics, you can learn all about containerizing ASP.NET Core applications in the documentation.

A Starting Point

You can create an ASP.NET Core Docker image for Windows containers by checking the ‘Enable Docker Support’ box while creating a new ASP.NET Core project in Visual Studio 2017 (or by right-clicking an existing .NET Core project and choosing ‘Add -> Docker Support’).

Adding Docker Support

To build the app’s Docker image from Visual Studio, follow these steps:

  1. Make sure the docker-compose project is selected as the solution’s startup project.
  2. Change the project’s Configuration to ‘Release’ instead of ‘Debug’.
    1. It’s important to use Release configuration because, in Debug configuration, Visual Studio doesn’t copy your application’s binaries into the Docker image. Instead, it creates a volume mount that allows the application binaries to be used from outside the container (so that they can be easily updated without rebuilding the image). This is great for a code-debug-fix cycle, but will give us incorrect data for what the Docker image size will be in production.
  3. Push F5 to build (and start) the Docker image.

Visual Studio Docker Launch Settings

Alternatively, the same image can be created from a command prompt by publishing the application (dotnet publish -c Release) and building the Docker image (docker build -t samplewebapi --build-arg source=bin/Release/netcoreapp2.0/publish .).

The resulting Docker image has a size of 1.24 GB (you can see this with the docker images or docker history commands). That’s a lot smaller than a Windows VM and even considerably smaller than Windows Server Core containers or VMs, but it’s still large for a Docker image. Let’s see how we can make it smaller.

Initial Template Image

Windows Nano Server, Version 1709

The first (and by far the greatest) improvement we can make to this image size has to do with the base OS image we’re using. If you look at the docker images output above, you will see that although the total image is 1.24 GB, the majority of that (more than 1 GB) comes from the underlying Windows Nano Server image.

The Windows team recently released Windows Server, version 1709. One of the great features in 1709 is an optimized Nano Server base image that is nearly 80% smaller than previous Nano Server images. The Nano Server, version 1709 image is only about 235 MB on disk (~93 MB compressed).

The first thing we should do to optimize our ASP.NET Core application’s image is to use that new Nano Server base. You can do that by navigating to the app’s Dockerfile and replacing FROM microsoft/aspnetcore:2.0 with FROM microsoft/aspnetcore:2.0-nanoserver-1709.
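
As a sketch, the top of the Dockerfile would then look something like the following (the build argument and publish path follow the Visual Studio template defaults; SampleWebApi is the sample project name used in this post):

```Dockerfile
# Nano Server, version 1709 based ASP.NET Core runtime image (much smaller base OS)
FROM microsoft/aspnetcore:2.0-nanoserver-1709
ARG source
WORKDIR /app
EXPOSE 80
# Copy the published output; falls back to the Visual Studio default publish folder
COPY ${source:-obj/Docker/publish} .
ENTRYPOINT ["dotnet", "SampleWebApi.dll"]
```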

Be aware that in order to use Nano Server, version 1709 Docker images, the Docker host must be running either Windows Server, version 1709 or Windows 10 with the Fall Creators Update, which is rolling out worldwide right now. If your computer hasn’t received the Fall Creators Update yet, don’t worry. It is possible to create Windows Server, version 1709 virtual machines in Azure to try out these new features.

After switching to use the Nano Server, version 1709 base image, we can re-build our Docker image and see that its size is now 357 MB. That’s a big improvement over the original image!

If you’re building your Docker image by launching the docker-compose project from within Visual Studio, make sure Visual Studio is up-to-date (15.4 or later) since recent updates are needed to launch Docker containers based on Nano Server, version 1709 from Visual Studio.

AspNet Core v1709 Docker Image

That Might be Small Enough

Before we make the image any smaller, I want to pause to point out that for most scenarios, using the Nano Server, version 1709 base image is enough of an optimization and further “improvements” might actually make things worse. To understand why, take a look at the sizes of the component layers of the Docker image created in the last step.

AspNet Core v1709 Layers

The largest layers are still the OS (the bottom two layers) and, at the moment, that’s as small as Windows images get. Our app, on the other hand, is the 373 kB layer towards the top of the layer history. That’s already quite small.

The only good places left to optimize are the .NET Core layer (64.9 MB) or the ASP.NET Core layer (53.6 MB). We can (and will) optimize those, but in many cases it’s counter-productive to do so because Docker shares layers between images (and containers) with common bases. In other words, the ASP.NET Core and .NET Core layers shown in this image will be shared with all other containers on the host that use microsoft/aspnetcore:2.0-nanoserver-1709 as their base image. The only additional space that other images consume will be the ~500 kB that our application added on top of the layers common to all ASP.NET Core images. Once we start making changes to those base layers, they won’t be sharable anymore (since we’ll be pulling out things that our app doesn’t need but that others might). So, we might reduce this application’s image size but cause others on the host to increase!

So, bottom line: if your Docker host will be hosting containers based on several different ASP.NET Core application images, then it’s probably best to just have them all derive from microsoft/aspnetcore:2.0-nanoserver-1709 and let the magic of Docker layer sharing save you space. If, on the other hand, your ASP.NET Core image is likely to be used alongside other non-.NET Core images which are unlikely to be able to share much with it anyhow, then read on to see how we can further optimize our image size.

Reducing ASP.NET Core Dependencies

The majority of the ~54 MB contributed by the ASP.NET Core layer of our image is a centralized store of ASP.NET Core components that’s installed by the aspnetcore Dockerfile. As mentioned above, this is useful because it allows ASP.NET Core dependencies to be shared between different ASP.NET Core application Docker images. If you have a small ASP.NET Core app (and don’t need the sharing), it’s possible to just bundle the parts of the ASP.NET Core web stack you need with your application and skip the rest.

To remove unused portions of the ASP.NET Core stack, we can take the following steps:

  1. Update the Dockerfile to use microsoft/dotnet:2.0.0-runtime-nanoserver-1709 as its base image instead of microsoft/aspnetcore:2.0-nanoserver-1709.
  2. Add the line ENV ASPNETCORE_URLS http://+:80 to the Dockerfile after the FROM statement (this was previously done in the aspnetcore base image for us).
  3. In the app’s project file, replace the Microsoft.AspNetCore.All metapackage dependency with dependencies on just the ASP.NET Core components the app requires. In this case, my app is a trivial ‘Hello World’ web API, so I only need the following (larger apps would, of course, need more ASP.NET Core packages):
    1. Microsoft.AspNetCore
    2. Microsoft.AspNetCore.Mvc.Core
    3. Microsoft.AspNetCore.Mvc.Formatters.Json
  4. Update the app’s Startup.cs file to call services.AddMvcCore().AddJsonFormatters() instead of services.AddMvc() (since the AddMvc extension method isn’t in the packages we’ve referenced).
    1. This works because our sample project is a Web API project. An MVC project would need more MVC services.
  5. Update the app’s controllers to derive from ControllerBase instead of Controller.
    1. Again, since this is a Web API controller instead of an MVC controller, it doesn’t use the additional functionality Controller adds.
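
Putting steps 2 and 3 together, the project-file side of the change might look like this sketch (package versions are illustrative; use the versions matching your app):

```xml
<!-- SampleWebApi.csproj: replace the Microsoft.AspNetCore.All metapackage
     with only the packages this Web API actually uses -->
<ItemGroup>
  <PackageReference Include="Microsoft.AspNetCore" Version="2.0.0" />
  <PackageReference Include="Microsoft.AspNetCore.Mvc.Core" Version="2.0.0" />
  <PackageReference Include="Microsoft.AspNetCore.Mvc.Formatters.Json" Version="2.0.0" />
</ItemGroup>
```

with services.AddMvcCore().AddJsonFormatters() registered in Startup.ConfigureServices in place of services.AddMvc(), per steps 4 and 5.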

Now when we build the Docker image, we can see we’ve shaved a little more than 40 MB by only including the ASP.NET Core dependencies we need. The total size is now 315 MB.

NanoServer No AspNet All

Bear in mind that this is a trivial sample app and a real-world application would not be able to cut as much of the ASP.NET Core framework.

Also, notice that while we eliminated the 54 MB intermediate ASP.NET Core layer (which could have been shared), we’ve increased the size of our application layer (which cannot be shared) by about 11 MB.

Trimming Unused Assemblies

The next place to consider saving space will be the .NET Core/CoreFX layer (which is consuming ~65 MB at the moment). Like the ASP.NET Core optimizations, this is only useful if that layer wasn’t going to be shared with other images. It’s also a little trickier to improve because, unlike ASP.NET Core, .NET Core’s framework is delivered as a single package (Microsoft.NETCore.App).

To reduce the size of .NET Core/CoreFX files in our image, we need to take two steps:

  1. Include the .NET Core files as part of our application (instead of in a base layer).
  2. Use a preview tool to trim unused assemblies from our application.

The result of those steps will be the removal of any .NET Framework (or remaining ASP.NET Core framework) assemblies that aren’t actually used by our application.

First, we need to make our application self-contained. To do that, add a <RuntimeIdentifiers>win10-x64</RuntimeIdentifiers> property to the project’s csproj file.

We also need to update our Dockerfile to use microsoft/nanoserver:1709 as its base image (so that we don’t end up with two copies of .NET Core) and use SampleWebApi.exe as our image’s entrypoint instead of dotnet SampleWebApi.dll.
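
A sketch of the resulting self-contained Dockerfile (the ENV line carries over the setting previously inherited from the aspnetcore base image; names follow the sample project):

```Dockerfile
# Plain Nano Server base: the app now carries its own copy of .NET Core
FROM microsoft/nanoserver:1709
ARG source
WORKDIR /app
EXPOSE 80
ENV ASPNETCORE_URLS http://+:80
COPY ${source} .
# Self-contained apps launch their own executable rather than 'dotnet <app>.dll'
ENTRYPOINT ["SampleWebApi.exe"]
```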

Up until now, it was possible to build the Docker image either from Visual Studio or the command line. But Visual Studio doesn’t currently support building Docker images for self-contained .NET Core applications (which are not typically used for development-time debugging). So, to build our Docker image from here on out, we will use the following command line interface commands (notice that they’re a little different from those shown previously since we are now publishing a runtime-specific build of the application). Also, you may need to delete (or update) the .dockerignore file generated as part of the project’s template because we’re now copying binaries into the Docker image from a different publish location.

dotnet publish -c Release -r win10-x64
docker build -t samplewebapi --build-arg
   source=bin/Release/netcoreapp2.0/win10-x64/publish .

These changes will cause the .NET Core assemblies to be deployed with our application instead of in a shared location, but the included files will be about the same. To remove unneeded assemblies, we can use Microsoft.Packaging.Tools.Trimming, a preview package that removes unused assemblies from a project’s output and publish folders. To do that, add a package reference to Microsoft.Packaging.Tools.Trimming and a <TrimUnusedDependencies>true</TrimUnusedDependencies> property to the application’s project file.
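
In the project file, those additions might look like the following sketch (the trimming package version shown is illustrative; use the latest preview available):

```xml
<PropertyGroup>
  <RuntimeIdentifiers>win10-x64</RuntimeIdentifiers>
  <!-- Ask the preview trimmer to drop assemblies nothing references -->
  <TrimUnusedDependencies>true</TrimUnusedDependencies>
</PropertyGroup>
<ItemGroup>
  <PackageReference Include="Microsoft.Packaging.Tools.Trimming" Version="1.1.0-preview1" />
</ItemGroup>
```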

After making those additions, re-publishing, and re-building the Docker image (using the CLI commands shown above), the total image size is down to 288 MB.

NanoServer SelfContained Trim Dependencies

As before, this reduction in total image size does come at the expense of a larger top layer (up to 53 MB).

One More Round of Trimming

We’re nearly done now, but there’s one more optimization we can make. Microsoft.Packaging.Tools.Trimming removed some unused assemblies, but others still remain because it isn’t clear whether dependencies on those assemblies are actually exercised or not. And that’s not to mention all the IL in an assembly that may be unused if our application calls just one or two methods from it.

There’s another preview trimming tool, the .NET IL Linker, which is based on the Mono linker and can remove unused IL from assemblies.

This tool is still experimental, so to reference it we need to add a NuGet.config to our solution and include https://dotnet.myget.org/F/dotnet-core/api/v3/index.json as a package source. Then, we add a dependency on the latest preview version of ILLink.Tasks (currently, the latest version is 0.1.4-preview-981901).

ILLink.Tasks will trim IL automatically, but we can get a report on what it has done by passing /p:ShowLinkerSizeComparison=true to our dotnet publish command.

After one more publish and Docker image build, the final size for our Windows ASP.NET Core Web API container image comes to 271 MB!

NanoServer Double Trim

Even though trimming ASP.NET Core and .NET Core Framework assemblies isn’t common for most containerized projects, the preview trimming tools shown here can be very useful for reducing the size of large applications since they can remove application-local assemblies (pulled in from NuGet, for example) and IL code paths that aren’t used.

Wrap-Up

This post has shown a series of optimizations that can help to reduce ASP.NET Core Docker image size. In most cases, all that’s needed is to be sure to use new Nano Server, version 1709 base images and, if your app is large, to consider some preview dependency trimming options like Microsoft.Packaging.Tools.Trimming or the .NET IL Linker.

Less commonly, you might also consider using app-local versions of the ASP.NET Core or .NET Core Frameworks (as opposed to shared ones) so that you can trim unused dependencies from them. Just be careful to keep common base image layers unchanged if they’re likely to be shared between multiple images. Although this article presented the different trimming and minimizing options as a series of steps, you should feel free to pick and choose the techniques that make sense for your particular scenario.

In the end, a simple ASP.NET Core web API sample can be packaged into a < 360 MB Windows Docker image without sacrificing any ability to share ASP.NET Core and .NET Core base layers with other Docker images on the host and, potentially, into an even smaller image (271 MB) if that sharing is not required.

New offers in Azure Marketplace – October 2017


The Azure Marketplace continues to grow rapidly. We added 38 great new offers in October. Check them out!

  • Puppet Enterprise 2017.2: Puppet Enterprise lets you automate the entire lifecycle of your Azure infrastructure, simply, scalably, and securely, from initial provisioning through application deployment. Learn more here.
  • ZeroDown Software Tool: The migration tool enables companies to move applications from their data center or other hosting platforms to Azure. The migration tool prompts users to identify the URL for the Source Application and Target Application. Learn more here.
  • Avi Vantage ADC Platform for Microsoft Azure: Avi Vantage delivers an enterprise-grade elastic load balancer with SSL offload, web application security, real-time application performance monitoring, and predictive autoscaling for optimal application sizing. Learn more here.
  • ROKMAPS: ROKMAPS™ is a robust application that allows organizations to view their GIS data on any device. Learn more here.
  • Parity Ethereum PoA: For private chain setups, Parity supports a Proof-of-Authority (PoA) consensus engine to be used as a replacement for the Proof-of-Work (PoW) consensus engine. Learn more here.
  • Cloud Dev Environment: GENESYS CLOUD DEV ENVIRONMENT is a pre-setup cloud infrastructure plus open .NET stack that lets developers and teams rapidly provision a working dev environment in 15 minutes and start coding. Learn more here.
  • BlueCat DNS for Azure: Powerful, software-driven DNS capabilities for Microsoft Azure, enabling the same agility, secure operation, simplification, and dependability that our customers have come to expect on their enterprise-class networks running BlueCat DNS Integrity solutions. Learn more here.
  • Aviatrix OpenVPN Server - 25 User: Aviatrix OpenVPN™ Access Server is a cloud-native (built for Azure) software solution that enables SSL secure remote access to Azure VNETs. Learn more here.
  • FlashGrid Node for Oracle RAC: The VM image contains FlashGrid software and can be used for creating Oracle Real Application Clusters nodes. FlashGrid-enabled clusters have the high availability and performance characteristics required for running mission-critical enterprise databases. Learn more here.
  • Jamcracker Hybrid Cloud Management: Jamcracker Cloud Management solutions go beyond basic multi-cloud management and address the various challenges organizations face while implementing cloud services. Learn more here.
  • NetConnect: Simple, fast, and secure access to cloud servers and applications from your device of choice. Learn more here.
  • AtScale Intelligence Platform: AtScale makes BI work on Big Data. With AtScale, business users get interactive, multi-dimensional analysis capabilities directly on HDInsight. Learn more here.
  • Netgate pfSense® Firewall/VPN/Router: pfSense® is the world’s leading open-source firewall software. With over 1 million active installations, enterprise-level organizations, higher education institutions, and government agencies around the world rely on pfSense software to provide dependable, full-featured firewall protection in the cloud. Learn more here.
  • Opsview Monitor: Opsview Monitor is a unified monitoring platform. It consolidates reporting and alerting for complex IT infrastructure and applications. Learn more here.
  • Aviatrix OpenVPN Server - 100 User: Aviatrix OpenVPN™ Access Server is a cloud-native (built for Azure) software solution that enables SSL secure remote access to Azure VNETs. Learn more here.
  • Aviatrix OpenVPN Server - 50 User: Aviatrix OpenVPN™ Access Server is a cloud-native (built for Azure) software solution that enables SSL secure remote access to Azure VNETs. Learn more here.
  • Aviatrix OpenVPN Server - 10 User: Aviatrix OpenVPN™ Access Server is a cloud-native (built for Azure) software solution that enables SSL secure remote access to Azure VNETs. Learn more here.
  • MaxParallel for SQL Server 2017: MaxParallel software for SQL Server significantly shortens your time to process transactions, generate reports, and analyze trends. Learn more here.
  • VIDIZMO Digital Asset Management (DAM): VIDIZMO DAM provides a centralized repository for the wide variety of digital media assets now managed by almost all contemporary business organizations. Learn more here.
  • VIDIZMO EnterpriseTube: Recognized as a challenger in Gartner’s enterprise video magic quadrant, VIDIZMO EnterpriseTube delivers the ultimate flexibility, scalability, and reliability for quality live and on-demand video streaming solutions. Learn more here.
  • VIDIZMO Virtual Academy: Use live and on-demand streaming video to conduct highly engaging, collaborative, and interactive training and learning activities for employees at all echelons of an organization. Learn more here.
  • RSK Ginger Testnet Node: This is a 1-click node setup. No additional steps are required after launching the instance. Use SSH to connect to the operating system. Learn more here.
  • Riverbed SteelCentral Controller for SH Mobile 5.5: SteelHead Mobile is a great way to bring application acceleration and optimization to mobile laptops and remote desktops. And to manage all those far-flung SteelHead Mobile deployments, turn to SteelCentral Controller for SteelHead Mobile. Learn more here.
  • ERC-20 Token Service on Quorum: This sample environment provides RESTful services for creating and managing ERC-20 tokens on top of Quorum. Learn more here.
  • Signal Sciences Web Protection Platform (WAF+RASP): Signal Sciences provides security, visibility, and protection for web apps, microservices, and APIs. Learn more here.
  • IBM HTTP Server: Install your IBM HTTP Server and WebSphere plugins with just a few clicks, licensed and ready to go. By using this bundle you are renting your IBM HTTP Server license on an hourly basis, with support from MidVision, a Premier IBM Business Partner. Learn more here.
  • Teradata QueryGrid Manager (IntelliSphere): Teradata QueryGrid (IntelliSphere) provides a federated query capability allowing users to access and query data in remote servers that are part of the Teradata QueryGrid data fabric. Learn more here.
  • Trade Finance – Letter of Credit: This is a blockchain application built on Ethereum. It illustrates the concept that an immutable fact pattern of all documents and data related to an import order can be maintained on a blockchain. Learn more here.
  • FogSM Basic Service: The basic version provides a basic CentOS OS with selected software packages, tested and packaged by Nebbiolo Technologies. This basic version is hardened for security and provides the foundation for the other services offered by Nebbiolo Technologies. Learn more here.
  • anynode - The Software SBC: A Session Border Controller is used when signaling and media flows between two separate VoIP networks need to be established, transmitted, and terminated. Learn more here.
  • WebSphere Application Server Base Edition and MQ: This pre-configured image offers IBM® WebSphere Application Server, a faster, more flexible Java application server runtime environment, and IBM® MQ, IBM’s market-leading messaging integration. Learn more here.
  • Azure to AWS or GCloud Gateway - 3 S2S Connections: A fully automated point-and-click networking solution to connect Azure to AWS or GCloud. Learn more here.
  • VIDIZMO Digital Asset Management (DAM): VIDIZMO DAM provides a centralized repository for the wide variety of digital media assets now managed by almost all contemporary business organizations. Learn more here.
  • VIDIZMO Digital Evidence Management (DEM) HA: A consolidated portal for end-to-end digital evidence management of all video and other digital media. Learn more here.
  • VIDIZMO EnterpriseTube HA: A consolidated video platform serving enterprise needs for live and on-demand video streaming solutions. Learn more here.
  • VIDIZMO Virtual Academy HA: A corporate video training and learning platform for video streaming and digital asset management. Learn more here.
  • vSRX Security Gateway: Easily extend your existing on-prem networks to the Azure cloud with the vSRX Security Gateway solution. Learn more here.
  • VeloCloud Virtual Edge (3.x): A VeloCloud virtual edge appliance to extend an SD-WAN network to the customer’s cloud infrastructure. Learn more here.

Accelerating the adoption of enterprise blockchain


Over the past year, we have had the pleasure of working with several customers on their business initiatives related to blockchain technology. During this time, we have been helping them envision business scenarios, choose the right blockchain protocols and distributed ledgers, and, most importantly, develop pilots focused on validating the technology’s ability to provide real value to their organizations.

When to use blockchain?

From a technical perspective, we could apply blockchain to many scenarios. However, not every situation requires blockchain; there are some scenarios where it creates significant value compared to alternative technologies, and others where it does not. Usually, the valuable scenarios are shared business processes involving organizations across industries such as financial services, manufacturing, or retail.

To recognize a blockchain scenario, make sure you are using the core capabilities of blockchain: you should be able to respond positively to the four key questions below, and there should be a real business case with measurable outcomes. If this is not the case, please consider using other, more mature technologies.

Blockchain capabilities

Blockchain is one of the top emerging technologies revolutionizing today’s business models. Fundamentally, blockchain enables participants to exchange value without the need for intermediaries.

Blockchain capabilities

But what is blockchain exactly? And what capabilities make it so attractive for enterprises? Blockchain is a disruptive technology trend that enables a shared, authentic, decentralized ledger:

  • SECURE: Blockchain uses strong cryptography to create transactions that are impervious to fraud and establishes a shared truth. All transactions are signed with a digital certificate.
  • SHARED: The real benefits of blockchain, over conventional technology, are achieved when we use it to link organizations to share information on a distributed ledger.
  • DISTRIBUTED: A blockchain can be distributed across multiple organizations and becomes more secure as replicas are added.
  • LEDGER: Every transaction is written into the ledger once and cannot be changed after the fact.

Questions to be answered before developing a blockchain solution

Answering the following four questions can determine if blockchain is appropriate for the identified business scenario.

  • Do multiple parties share data?
  • Do multiple parties update data?
  • Is there a requirement for verification?
  • Can removing intermediaries reduce cost and complexity?

If you answered yes to all of these questions, then you have a potential scenario to apply blockchain.

Public blockchain vs. enterprise blockchain

A public blockchain (e.g., Bitcoin or Ethereum) is an Internet protocol managing the distribution of potentially unique data, with the following characteristics:

  • Many, anonymous, or pseudonymous participants
  • Open read and write by all participants
  • Consensus by proof of work

Too often, organizations fail with blockchain because they try to use public blockchain networks, or their rules, for their enterprise solutions. Instead, organizations should consider using an enterprise blockchain.

What do we mean by enterprise blockchain? An enterprise blockchain (e.g., Hyperledger, Enterprise Ethereum, Ripple, Quorum) is a distributed ledger with the following characteristics:

  • All participants, and their digital identities, are known to one or more trusted organizations
  • Write and read permissions are role-based and usually require the consensus of several participants
  • Multiple algorithms are used for consensus

There are two types of enterprise blockchain:

  • Private: Usually managed by a single organization. Typically, the network participants are internal business units or divisions.
  • Consortium: The blockchain network is managed by multiple trusted organizations, and adding new participants requires the consensus of several existing participants.

Industries using blockchain

The potential impact of blockchain is significant across all sectors and industries—from banking to government to healthcare and beyond:

  • Eliminates intermediaries, increasing efficiency and speed.
  • Simplifies operations by reducing the cost and time of reconciliations and disputes.
  • Potentially enables new business models, increasing revenue and savings.

According to top market analysts and leading consulting firms, the top five industries that blockchain will likely disrupt by 2020 are financial services, government, real estate, supply chain management, and media distribution.

Reviewing our historical data, we also see that close to 80% of the customers using blockchain on Microsoft Azure are financial services institutions, including insurance companies. However, as you can see in the following figure, the trend is changing if we consider only existing engagements and the pipeline.

Industries

On the other hand, in reviewing the solutions related to banking and capital markets, we discovered that 60% of the blockchain implementations involved at least one participant from a second industry, such as manufacturing, government, or retail.

Common business patterns 

In just the last 12 months, we counted 76 scenarios across seven industries. The good news is that, based on our engagements with customers and partners, we were able to reduce them to the following eight business patterns.

Common business patterns

By exploring business patterns, our customers can now learn how blockchain can promote operational simplification, reduce the role of intermediaries, and potentially enable new business models. For each pattern, we provide an overview of the common needs and challenges, the potential benefits of applying blockchain, key near-term milestones for initial blockchain applications, and a use case sequence diagram. This approach is helping us define reference architectures and develop IP that reduces the time-to-market of blockchain solutions.

As an illustrative example, below I'm sharing current blockchain applications and some public references:

Recommended engagement model

Based on our previous engagements, our experience delivering innovative projects, and our agile approach, we are now ready to help customers understand the potential impact of blockchain technology on their industries, as well as determine whether it can deliver both cost efficiencies and competitive advantage for them.

Microsoft Services provides offers to help our customers have an improved understanding of blockchain, explore the potential of this technology through business scenarios and implement a Minimum Viable Product (MVP) based on Microsoft Azure.

Engagement model

Supported protocols

To finish, let us share our answer to the question customers ask most frequently: Does Microsoft have its own blockchain ledger? The answer is no.

Microsoft has been working on blockchain since November 2015, when we became the first major cloud provider to announce Blockchain as a Service (BaaS). Our vision is to be the worldwide cloud platform leader powering blockchain-based applications.

Microsoft is working with customers, partners, and the developer community to accelerate blockchain’s enterprise readiness. Our mission is to help companies thrive in this new era of secure multi-party collaboration by delivering open, scalable platforms and services that any company can use to improve shared business processes. Our roadmap is based on the following principles:

  • Blockchain on your terms: No one-size-fits-all approach — Microsoft’s platform and ecosystem partners make it easy to get started and iterate quickly with the blockchain of your choice, both on-premises and in the cloud.
  • Integrated with your business: Merge blockchain with the IT assets you already have. Azure lets you integrate blockchain with the cloud services your organization uses to power shared processes.
  • Ready for the enterprise: With the Coco Framework, Cryptlets, and our Azure service integrations, Microsoft is addressing existing technology gaps in blockchain and helping organizations build durable, enterprise-grade applications.

Our active participation in industry consortiums such as R3, the Enterprise Ethereum Alliance, and IC3 also helps us understand core industry scenarios and continue learning to meet the needs of our customers.

Currently, Microsoft supports the most widely used blockchain and distributed ledger protocols on Azure, including Hyperledger Fabric, R3 Corda, Quorum, Chain Core, and BlockApps.

For more information, please contact your Microsoft Services representative and ask for the BLOCKCHAIN ESSENTIALS Offer. You can also visit www.microsoft.com/services.

Visual Studio IDE extensions now published and managed at Marketplace


Consumers of Visual Studio IDE extensions visit Visual Studio Marketplace to discover and acquire extensions. But extension publishers visit Visual Studio Gallery to publish and manage their Visual Studio IDE extensions. Henceforth, extension publishing and management will also be in Marketplace. In this process, all extensions in Gallery will be automatically moved over to Marketplace. Marketplace now serves as a single place for extension publishers and consumers to publish, manage or acquire extensions.

Benefits of publishers moving to Marketplace

  1. You can have multiple people, with varied roles and permissions, manage your extension, and you can control what they can and cannot do via the role granted to them.
  2. As a publisher, you can view overall usage, ratings and reviews, and Q&A for all your extensions, and then take suitable actions via the extension reporting hub. Read more here.
  3. You can also create multiple publisher profiles and list different extensions under the appropriate profile. For example, profile 1 can be used to manage your organization-level extensions, while profile 2 can be used to list and manage your personal extensions.

 

We are rolling out extension publishing in phases. You can publish and manage new extensions in Marketplace right away. However, editing an existing extension (i.e., one initially uploaded to Gallery) will be enabled in Marketplace in the next few weeks.

To publish and manage a new extension in Marketplace

  1. Build an extension by following the process laid out here.
  2. Visit Visual Studio Marketplace and click “Publish extensions” in the top right. You may be asked to sign in with your Microsoft account and to create a publisher profile if you don’t already have one.
  3. Upload a new extension.
  4. Manage the new extension via its right-click context menu options.

We hope you like the above changes and look forward to hearing from you at vsmarketplace@microsoft.com or on Gitter.

Harish Kumar Agarwal, Senior Program Manager, Visual Studio

Harish is on the Marketplace team and tries to ensure that publishers and consumers of extensions for Visual Studio products are able to publish, manage or acquire extensions easily.

Don Jayamanne, creator of the Python extension for Visual Studio Code, joins Microsoft


I'm delighted to announce that Don Jayamanne, the author of the most popular Python extension for Visual Studio Code, has joined Microsoft! Starting immediately, Microsoft will be publishing and supporting the extension. You will receive the update automatically, or visit our Visual Studio Marketplace page and click "Install".

Python has a long history at Microsoft, starting with IronPython, Python Tools for Visual Studio, the Python SDK for Azure, and Azure Notebooks, as well as contributions of developer time and support to CPython.

While some people like using a full-featured IDE such as Visual Studio, others prefer to have a lighter weight, editor-centric experience such as Visual Studio Code. Don has been working on the Python extension part-time for the past couple of years. We were impressed by his work and have hired him to work on it full-time, along with other members of our Python team.

What does Microsoft Python team publishing the extension mean?

For all practical purposes the transition should be transparent to you. Additionally:

  • The extension will remain open source and free
  • Development will continue to be on GitHub, under the existing license
  • More dev resources means (generally) faster turnaround on bug fixes and new features
  • Official support will be provided by Microsoft

For this first release we're focusing on fixing a number of existing issues and adding a few new features such as multi-root and status bar interpreter selection (a complete list of changes can be found in the changelog).

Note: If for whatever reason you would prefer to continue to use the extension as Don released it prior to joining Microsoft, you can uninstall the Python extension and then download the VSIX file and install it manually. Note that no further development will be done in the old GitHub repo.

We're hiring for Visual Studio Code / Python!

We're hiring devs immediately to continue and expand work on our Python support for Visual Studio Code. If you are passionate about developer tools and productivity, this could be an ideal endeavor! The ideal candidate has a mix of IDE (editor/debugger), JavaScript/TypeScript, and Python in their background, and experience writing plugins for VS Code is a big plus. If you bring something really special to the table we can consider remote, but ideally you would plan to relocate to our Redmond offices. If interested, please send your resume to pythonjobs@microsoft.com with the subject: "VSC-Python".


Visual Studio Code C/C++ extension Nov 2017 update – Multi-root workspaces support is here!


This week has been very exciting for the Visual Studio Code C/C++ extension! It crossed 4 million downloads earlier this week, only 18 months after its first release! Today, we are shipping the November 2017 update, which enables the extension to work seamlessly with multi-root workspaces, making VS Code an even more powerful C/C++ development environment.

Multi-root workspaces support

VS Code 1.18 and later provides multi-root workspaces support, which allows users to work with multiple project folders in VS Code. This can be very useful when you are working on several related projects at the same time. With the November update, the C/C++ extension now provides C/C++ IntelliSense, code browsing, and debugging support for each opened folder independently. Follow the instructions in the VS Code Multi-root Workspaces document to add folders to your workspace. The screenshot below shows two folders opened in VS Code, each with its own c_cpp_properties.json file defined:

The IntelliSense engine retrieves include paths and defines from the c_cpp_properties.json file located in the .vscode folder of each opened folder to provide all the IntelliSense and code browsing features, including member list, parameter hints, go to definition, etc. Each folder operates independently of the others.
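As a rough illustration, a minimal c_cpp_properties.json might look like the sketch below. The paths, defines, and mode value here are placeholders, and the exact schema fields and variable names can vary by extension version:

```json
{
    "configurations": [
        {
            "name": "Linux",
            "includePath": [
                "${workspaceFolder}/**",
                "/usr/include"
            ],
            "defines": ["DEBUG"],
            "intelliSenseMode": "clang-x64"
        }
    ]
}
```

Each folder in the workspace can carry its own copy of this file, which is what lets the two folders in the screenshot resolve symbols independently.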

For example, in the screenshot below, I have two Calculator.cpp files opened in VS Code, one from the “Calculator” folder and the other from the “Fancy Calculator” folder. Each file has a Calculator class defined, but with different functions. After typing “c.”, the member list that pops up in each file shows the expected member functions matching the definition of the class that lives in the current context (i.e., the folder that the document being edited belongs to).

Compiler-based IntelliSense is the new default

By now you may have noticed an improved out-of-box IntelliSense experience. 😊 The compiler-based IntelliSense engine is now the default engine to provide better and more accurate results to some of the key IntelliSense features, including auto-complete suggestions for class/struct/namespace members, quick info tooltips, error squiggles, reference highlighting, and parameter hints. If you are curious, this blog post talks about how it works and how to control the behavior.
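If you want to switch engines yourself, the behavior is controlled through a VS Code setting; a minimal sketch (the setting name is the one used by the C/C++ extension, and “Tag Parser” falls back to the previous fuzzy engine):

```json
{
    "C_Cpp.intelliSenseEngine": "Default"
}
```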

Tell us what you think

Download the C/C++ extension for Visual Studio Code, try it out and let us know what you think. File issues and suggestions on GitHub. Provide us feedback in this quick survey to help shape this extension for your needs. You can also find us on Twitter (@VisualC).

Application Insights: VSTS dashboard chart widget now available


Yes, you read that correctly. We have just released an Application Insights chart widget for VSTS dashboards!

It's OK, you can go ahead and say it: this has been a long time coming. We heard very clearly from fans of our previous metric widget that this was something almost everyone wanted. But for quite a while we also had other, even higher-priority things that our user base really wanted and needed (you know, little things like bringing the product to general availability, and developing the whole Usage Analytics functionality area). The good news is that we never forgot about the asks for this feature set, and it's finally here.

How do I get the new widget?

If you already have our previous metric widget installed, you don't have to do anything. You should already have the new chart widget available. If you're starting from scratch you can download the Application Insights widget​ from the Visual Studio marketplace.

How does the new widget work?

Simply click the edit button on your VSTS dashboard, then click on the add (+) button. You'll see a list of available tiles, including the previously available Application Insights metric widget as well as the new chart widget:

Add a new chart widget

Once the widget has been added to your dashboard, open the configuration, provide the application ID and API key, and then select the metric you want to track with the widget as well as particulars like time range and granularity. When you're finished you'll get a lovely chart that will look something like this:

Application Insights VSTS dashboard chart widget

That's it. You can choose from line (default), area, or column charts as appropriate for the type of data you intend to return. The chart is also resizable and more detail is added as the size increases. So, you can get a small summary chart or a large detailed chart complete with timestamped axis labels. You can always hover over any data point to get a tooltip that will display the detailed data.

The current chart widget does not provide threshold warnings the way that the metric widget does. This is on purpose for this first iteration, as there was some debate as to how we should determine when a threshold has been crossed. For instance, if you're displaying average request execution time (as in the example graphic above), do you want to show a critical/warning condition if any of the values in the average cross that threshold or is it if the average value itself exceeds the limit? Do you only trigger the condition if it's currently exceeding the limit or should it display if any value in the time range of the chart meets the condition? The possibilities are much more complex than with the metric widget, so we have not added this functionality yet to avoid confusion. If you have specific feedback on this feature, please let us know. We always welcome your input.
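To make the two interpretations concrete, here is a small illustrative sketch; it is not part of the widget, and the function name and semantics are ours:

```python
from statistics import mean

def threshold_state(values, limit):
    """Classify a metric series against a threshold two ways:
    - 'point': critical if ANY individual data point exceeds the limit
    - 'average': critical if the average of the series exceeds the limit
    """
    return {
        "point": any(v > limit for v in values),
        "average": mean(values) > limit,
    }

# Average request execution time samples (ms): one spike above 250 ms,
# but the overall average stays below the limit.
samples = [120, 95, 310, 88]
print(threshold_state(samples, 250))  # → {'point': True, 'average': False}
```

As the sample shows, the two rules can disagree on the same data, which is exactly why we held off on shipping thresholds until the semantics are settled.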

Next Steps

As mentioned before, we will consider adding the threshold feature to this widget so long as we feel that we can establish logical parameters that make sense for the majority of the user base. Additionally, we are still looking into an Application Analytics (query-based) widget as several users have mentioned that this would be useful as well. Please go upvote these features if they are important to you.

We have some more great devops functionality coming your way soon. Thanks for reading!

Grez

_______________

As always please share your ideas for new or improved features on the Application Insights UserVoice page. For any questions visit the Application Insights Forum.

Announcing automatic OS upgrades for Azure VM scale sets


Infrastructure scale and complexity

When Azure virtual machine scale sets were introduced, we made it easy to deploy and manage cloud infrastructure at very large scale. We are continually looking for ways to lower the overall cost of managing cloud infrastructure. One area of cost that increases with scale is keeping each virtual machine patched to the latest OS version while keeping your applications running. This means finding ways to smoothly roll out an upgrade, one batch of VMs at a time. Many OS providers create new patched versions of their VM images at a monthly cadence, or more frequently for critical security fixes, but checking for and manually rolling out these upgrades adds to cloud infrastructure management cost.

Announcing new automation features for scale sets

Today we are announcing a new set of automation features to simplify OS image upgrades for scale sets, allowing you to adopt a set-it-and-forget-it approach to OS lifecycle maintenance.

Automatic OS image upgrade preview

Now in preview, the automatic OS image upgrade feature for Azure scale sets will automatically upgrade all VMs in your scale set to the latest version. Once configured, the latest OS image published by image publishers will automatically be applied to the scale set without user intervention.

To minimize application downtime, upgrades take place in batches of machines, with no more than 20% of the scale set upgrading at any time. You also have the option to integrate an Azure Load Balancer application health probe. This is highly recommended to incorporate an application heartbeat and validate upgrade success for each batch in the upgrade process.

To maintain consistency and adherence to the latest OS versions across your applications, it’s also possible (and recommended) to configure an Azure Resource Manager policy, which enforces automatic image upgrade for all the scale sets in your subscription.

An upgrade works by replacing the OS disk of a VM with a new one created using the latest image version. Any configured extensions and custom data scripts are run, while persisted data disks are retained. You can opt out of automatic upgrades at any time, or manually initiate an upgrade.

Note: There are no restrictions on virtual machine or scale set size, and auto-OS upgrade works for both Windows and Linux VMs.
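As a sketch, opting a scale set into the preview in a Resource Manager template might look like the fragment below; the property names reflect the preview and may differ across API versions:

```json
{
    "properties": {
        "upgradePolicy": {
            "mode": "Rolling",
            "automaticOSUpgrade": true
        }
    }
}
```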

Which operating systems are eligible for automatic OS upgrade?

Initially the following operating systems are eligible for the automatic OS upgrade preview:

  • Windows Server 2016 Datacenter
  • Windows Server 2012 R2 Datacenter
  • Ubuntu Server 16.04-LTS

We’ll be adding more OS families as we go along.

When does an automatic OS upgrade happen?

Automatic OS upgrades are triggered shortly after the publisher of the OS releases a new image version.

How does automatic image upgrade differ from in-VM upgrade options like Windows Update?

Image upgrades replace the operating system disk of each VM, one batch at a time. This works well for stateless applications that do not keep application state on the OS disk, or applications that keep redundant copies of state data. The new images are publisher maintained and certified. Updates are applied using safe deployment practices across Azure regions; for example, they do not happen in geo-paired regions at the same time.

In-VM upgrade options like Windows Update apply operating system patches without replacing the OS disk. This works well for stateful applications that keep application state on the OS disk, but it also risks application downtime because the patch can be applied to all VMs at the same time. Also, the virtual machine OS disks drift further and further ahead of the scale set’s source image, which means new VMs (for example, when a scale set scales out) will go through a longer patch cycle when they are created.

Getting started

To learn more about how to register your subscription for the preview and to get started, please read Azure virtual machine scale set automatic OS upgrades.

There you will find instructions to configure a load balancer health probe, enforce an upgrade policy, check the status of your upgrade, and example Azure templates to get you started.

Windows System State Backup to Azure with Azure Backup is generally available


We are excited to announce the general availability (GA) of Windows Server System State Backup to Azure with the Azure Backup agent. We previewed the direct offsite of Windows Server System State to Azure using the Azure Backup agent earlier this year. This was a key addition to Azure Backup agent’s existing capability of backing up files and folders directly to Azure. With this GA release, the Azure Backup agent has full production support for protecting Windows File Servers, Active Directory, and IIS Web servers hosted on Windows Server 2016 all the way back to Windows Server 2008 R2. Backing up your Windows Server System State to Azure gives you a simple, secure and cost-effective way of protecting Windows Servers and enabling the recovery of dynamic OS and application configuration from Azure in the event of an IT disaster.

New features

  • Flexible backup schedule and retention policy for System State
    Now you can configure daily backups for System State at your preferred time directly from the Azure backup agent console. You can also set retention ranges for your daily, weekly and monthly system state backups. These options put you in control of managing your data.
  • Automation at scale with PowerShell
    Full PowerShell support for configuration, backup and recovery of System State so that you can automate protection of Windows Server files and configuration at scale.

Windows-Server-System-State-Backup-Azure-Backup

Benefits of System State Backup with Azure Backup

  • Comprehensive protection for Active Directory, File-Servers and IIS Web servers
    System State fully encapsulates Active Directory, which is the most important database in any organization and allows for targeted domain-controller recoveries. In addition, critical cluster information of File Servers and the IIS Web Server Metabase is fully contained in the Windows System State.
  • Centralized management in Azure
    Once it is backed up, all information related to System State backups across your Windows Servers is available in the Azure portal. You can also configure notifications directly from the Azure portal so you get notified of a failed backup and you can take corrective steps. You can also generate reports using Microsoft Power BI.
  • Cost-effective and secure offsite storage for Windows Server
    With pay-as-you-go Azure storage, Azure Backup eliminates on-premises infrastructure by directly backing up your Windows Server System State to Azure. Azure Backup also encrypts your backups at the source using a key that only you have access to. Additionally, enhanced security features built into Azure Backup ensure that your critical system state backups remain secure from ransomware, corruptions, and deletions.
  • Free restores
    With Azure Backup, you can restore System State files from Azure without any egress charges.

Getting started

 

Follow the four simple steps below to start protecting Windows Servers using Azure Backup.

  1. Create an Azure Recovery Services Vault in the Azure portal
  2. Download the latest version of the Azure Backup Agent to your on-premises Windows Servers from the Azure Portal
  3. Install and Register the Agent to your Recovery Services Vault in Azure
  4. Start protecting Windows Server System State and other Files and Folders directly to Azure!

Related links and additional content

Monitor Azure services and applications using Grafana


Today, we are excited to introduce the Grafana plugin for Azure Monitor and Application Insights. Azure is an open platform that enables you to bring workloads built using your favorite tools and frameworks and host them alongside a wide variety of services in Azure. As you continue your journey to the cloud and onboard your applications to Azure, many of you have expressed the need to leverage your existing open source DevOps and management tools for business continuity. Our mission is to meet you where you are and enable you to seamlessly use these tools in Azure. One popular tool is Grafana, a leading open source package for visualizing your time series metrics. Several of you provided feedback asking for the ability to consume metrics from Azure services and applications in Grafana as part of your existing monitoring investments. We have some great news: you can now use the Grafana data source plugin for Azure to do this.

Grafana plugin for Azure Monitor and Application Insights preview

If you are already using Grafana, you can now use it to monitor Azure services and applications too, thanks to the new Azure Monitor data source plugin, built by the team at Grafana Labs, the company behind Grafana. This plugin, currently available in preview, enables you to include metrics from Azure Monitor and Application Insights in your Grafana dashboards. Please visit the data source plugin installation page to review the setup instructions.

Contoso Loans Grafana dashboard

Figure 1: A Grafana dashboard that shows metrics from Azure Monitor and Application Insights

With this plugin, you can now bring together all the metrics collected from a variety of open source agents such as Collectd, Telegraf, and Prometheus, and metrics from Azure platform and applications into one dashboard.

Easy Grafana setup via Azure Marketplace offering

The team at Grafana Labs wanted to make it easier for you to get started with Grafana on Azure and therefore built an Azure Marketplace offer. You can use the Grafana server in the Azure Marketplace, which is essentially a VM with the Grafana dashboard server and the Azure plugin pre-installed.

A sample walkthrough to setup a Grafana dashboard with Azure Monitor and Application Insights metrics can be found in the Azure Monitor Documentation.

The team at Grafana Labs is excited to hear from you about the plugin and the Marketplace offer. You can reach out to them on Twitter @grafana or submit your feedback on GitHub.

Wrapping up

We will continue to empower you to do more by innovating in our platform and services, as well as integrating them with the open source frameworks and tools you love by partnering with the community. The Grafana integration is just one of many integrations to come. We are excited to hear your feedback after you try this with your applications and resources in Azure.

Microsoft Cosmos DB in Azure Storage Explorer – public preview


We are happy to announce the public preview of support for Cosmos DB in the Azure Storage Explorer (ASE). With this release Cosmos DB databases can be explored and managed with the same consistent user experiences that make ASE a powerful developer tool for managing Azure storage.

The extension allows you to manage Cosmos DB entities, manipulate data, and create and update stored procedures, triggers, and user-defined functions. Azure Storage Explorer not only offers unified developer experiences for inserting, querying, and managing your Azure Cosmos DB data, but also provides an editor with syntax highlighting and suggestions for authoring your Cosmos DB stored procedures. With this extension you can now browse Cosmos DB resources across both the DocumentDB and MongoDB APIs alongside the existing experiences for Azure blobs, tables, files, and queues in ASE.

Key customer benefits

  • Enables ASE to be a one-stop shop to manage Azure database and storage resources
  • Provides an easy-to-understand explorer experience for Cosmos DB data and structure
  • Offers flexible navigation experiences across Cosmos DB accounts and hierarchies
  • Delivers better accessibility for file navigation and data management capability with reliable performance

Summary of key features

  • Open Cosmos DB account in the Azure portal
  • Add resources to the Quick Access list
  • Search and refresh Cosmos DB resources
  • Connect directly to Cosmos DB through a connection string
  • Create and delete Databases
  • Create and delete Collections
  • Create, edit, delete, and filter Documents
  • Create, edit, and delete Stored Procedures, Triggers, and User-Defined Functions
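To illustrate what such a stored procedure looks like, here is a minimal server-side JavaScript sketch. In Cosmos DB, the getContext() object is supplied by the runtime; the mock at the bottom is not part of Cosmos DB and exists only so the sketch can run standalone:

```javascript
// A minimal Cosmos DB stored procedure sketch: count the documents
// returned by a query and return the count in the response body.
function countDocuments() {
    var collection = getContext().getCollection();
    var response = getContext().getResponse();

    var accepted = collection.queryDocuments(
        collection.getSelfLink(),
        "SELECT * FROM c",
        function (err, docs) {
            if (err) throw err;
            response.setBody({ count: docs.length });
        }
    );
    if (!accepted) throw new Error("Query was not accepted by the server.");
}

// --- Mock runtime for standalone illustration only (NOT Cosmos DB) ---
var _body;
function getContext() {
    return {
        getCollection: function () {
            return {
                getSelfLink: function () { return "dbs/demo/colls/demo"; },
                queryDocuments: function (link, query, callback) {
                    callback(null, [{ id: "1" }, { id: "2" }]); // fake result set
                    return true; // query accepted
                }
            };
        },
        getResponse: function () {
            return { setBody: function (b) { _body = b; } };
        }
    };
}

countDocuments();
console.log(_body.count); // 2 with the mock data above
```

With the extension's editor, a body like countDocuments above is what you would author and save against a collection.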

document

Get started

Install Azure Storage Explorer: [Windows], [Mac], [Linux]

For more information about Cosmos DB in ASE, please see:

 

Learn more about today’s announcements on the Azure blog. Discover more Azure service updates.

If you have questions, feedback, comments, or bug reports, please use the comments below or send a note to cosmosdbtooling@microsoft.com.

Recap: EARL Boston 2017


By Emmanuel Awa, Francesca Lazzeri and Jaya Mathew, data scientists at Microsoft

A few of us got to attend the EARL conference in Boston last week, which brought together a group of talented R users from academia and industry. The conference highlighted various enterprise applications of R. Despite being a small conference, the quality of the talks was great, and they showcased various innovative ways of using some of the newer packages available in the R language. Some attendees were veteran R users while others were newcomers to the language, so there was a mix of proficiency levels in R.

R currently has a vibrant community of users, and there are over 11,000 open source packages. The conference also encouraged women to join their local R-Ladies chapter, with the aim of increasing women's participation at R conferences and the number of women who contribute R packages to the open source community.

The team from Microsoft got to showcase some of our tools, namely Microsoft ML Server, and our commitment to supporting the open source R language. Some of the Microsoft sessions were:

  1. Deep Learning with R – Francesca Lazzeri
  2. Enriching your Customer profile at Scale using R Server – Jaya Mathew, Emmanuel Awa & Robert Alexander
  3. Developing Deep Learning Applications with CNTK – Ali Zaidi

Microsoft was a sponsor of the event and had a booth at the conference, where there was a live demo using the Cognitive Services APIs, namely the Face API, to detect age, gender, and facial expression.

In addition, some of the other interesting talks were:

  1. When and Why to Use Shiny for Commercial Applications – Tanya Cashorali
  2. HR Analytics: Using Machine Learning to Predict Employee Turnover – Matt Dancho
  3. Using R to Automate the Classification of E-commerce Products – Aidan Boland
  4. Leveraging More Data using Data Fusion in R – Michael Conklin

All the slides from the conference will be available at the conference website shortly. For photos from the conference, visit EARL’s twitter page.


DevOps @ Connect(); 2017


I am excited: there will be lots of news at Microsoft Connect(); 2017, happening Wednesday, November 15th in New York City.

Please join the live stream starting at 10:00 AM EST for Scott Guthrie’s keynote, where he will showcase lots of new innovations across Azure, .NET, Visual Studio, Visual Studio Team Services and more. 

At 3:00 PM EST, I am going to double-click on the DevOps news and dive deeper into the latest updates to Team Foundation Server and Visual Studio Team Services. 

Click here to drop a reminder in your calendar.

Brian

 

Top stories from the VSTS community – 2017.11.10

Here are the top stories we found in our streams this week related to DevOps, VSTS, TFS, and other interesting topics.

TOP STORIES

  • Defining a DevOps Culture – Esteban Garcia
    What do people mean when they tell you that you should have a DevOps culture to succeed with DevOps? Isn’t DevOps just about the tools and automation?... Read More

Run and Debug Java 9 in Visual Studio Code


In the past 3 weeks, we’ve continued to see a lot of people installing and trying our tools, reading our documents, and visiting our repository. We’ve also seen a number of new issues opened by the Java community. Thank you all for trying our tools and providing feedback, all of which motivates us to make VS Code a better tool for Java developers. With our new 0.3.0 release, we have added a few new capabilities and addressed your top feedback.

Java 9 Support

Java 9 was officially out just weeks after the initial release of our debugger. To allow more developers to try the new version of Java with VS Code, we’ve updated the debugger to automatically detect modular projects and then resolve the module path and main class accordingly. You can now launch your Java 9 application and hit breakpoints in your source files with VS Code.


No Project? No Problem!

Even if you have just a single Java file that is not part of any project, you can now run and debug it with VS Code. We believe this will come in handy when you are learning Java or writing a quick utility.

Single File Debug

Configuring Workspace Folder and Environment Variables

You can now configure the working directory and set environment variables for your debuggee program in the launch.json file.

cwd envVars Support
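A minimal launch.json sketch of these settings follows. The attribute names `cwd` and `env` are assumptions based on current VS Code debug conventions (early releases of the extension used `envVars`), and `com.example.Main` and `APP_MODE` are hypothetical placeholders:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "java",
      "name": "Debug (Launch)",
      "request": "launch",
      "mainClass": "com.example.Main",
      "cwd": "${workspaceFolder}",
      "env": {
        "APP_MODE": "debug"
      }
    }
  ]
}
```

Check the extension's changelog for the exact attribute names supported by your installed version.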

For other changes and bug fixes, you can find more details in our changelog.

Looking forward

As you can see from the issue list in our open-source GitHub repository, we’re actively working on several key new features which will be available in follow-up releases. These include JUnit test case support, multi-root workspace support, expression evaluation, and hot code replace. Please don’t hesitate to share your thoughts with us: join the Gitter discussion or submit an issue!

Try it out

If you’re trying to find a performant editor for your Java projects, please give it a try!

Xiaokai He, Program Manager, Java Tools and Services
@XiaokaiHe

Xiaokai is a program manager working on Java tools and services. He’s currently focusing on making Visual Studio Code great for Java developers, as well as supporting Java in various Azure services.

Developing AI applications on Azure: learning plans at three levels


If you're looking to expand your skills as an AI developer, or are just getting started, these learning plans for AI developers on Azure provide a wealth of information to get you up to speed. The beginner, intermediate, and advanced tracks all provide step-by-step guides to setting up the tools and data in Azure, along with worked examples in IPython Notebooks.

  • The Beginner AI Developer Learning Plan provides an introduction to artificial intelligence and cognitive systems. It begins with an overview of Cognitive Services, and then works through several examples of using those APIs in applications: handwriting comprehension, speech comprehension, and face detection.
  • The Intermediate AI Developer Learning Plan walks through the process of creating an AI application that understands voice input. It begins with an overview of LUIS, the Language Understanding Intelligent Service, and walks through the process of defining intents, entities, utterances, and labels. By the end of the session, you will have created a Python app that can understand commands you define and execute them when they are spoken.
  • The Advanced AI Developer Learning Plan focuses on vision-enabled applications. It begins with an introduction to computer vision and deep learning, and sets you up with a data science virtual machine for development. You will use CNTK to train a simple neural network on image data, and logistic regression to build a classifier. Then, you'll build a convolutional neural network and use transfer learning to implement object recognition in the app, and learn how to deploy the recognizer to a Raspberry Pi device.

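To make the classifier step in the advanced plan concrete, here is a minimal logistic-regression sketch in plain Python, trained with gradient descent. This only illustrates the modeling idea; it is not the plan's CNTK code, and the tiny two-feature dataset here is synthetic rather than image data.

```python
import math

def sigmoid(z):
    """Logistic function mapping a raw score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=200):
    """Fit logistic regression by per-sample gradient descent.
    samples: list of feature lists; labels: 0/1. Returns (weights, bias)."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log-loss w.r.t. the raw score
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Classify a sample as 1 if the predicted probability is >= 0.5."""
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0

# Linearly separable toy data: class 1 roughly when x0 + x1 is large.
data = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 0.8]]
labels = [0, 0, 1, 1]
w, b = train(data, labels)
print([predict(w, b, x) for x in data])
```

The learning plan builds the same kind of classifier (and then convolutional networks) with CNTK on real image features, but the training loop above is the underlying idea.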

Each of the learning paths will take 4-6 hours to work through. A Microsoft Azure subscription is required to implement various services in the plan (you can get an Azure free account if you don't already have one). You can find all three AI Developer Learning Plans at the link below.

Microsoft Azure Learning Paths: AI Developer on Azure

Because it’s Friday: Drone impact


On July 2 this year, a remotely-piloted drone wandered into active airspace ... twice. Not only does the video below illustrate the significant impact of this event, but it's also an interesting visualization of flight data.

That's all from us here at the blog for this week. We'll be back on Monday -- have a great weekend!

 
