
Top Stories from the Microsoft DevOps Community – 2018.08.10

Time for my favorite part of the week: sitting back on a Friday afternoon and catching up on the hard work going on in the DevOps community. And this week didn’t disappoint, there were so many great stories that it was hard to pick just a few of my favorites. Creating a CI/CD pipeline with... Read More

Redmonk Language Rankings, June 2018


The latest Redmonk Language Rankings are out. These rankings are published twice a year, and the top three positions in the June 2018 rankings remain unchanged from last time: JavaScript at #1, Java at #2, and Python at #3. The Redmonk rankings are based on Github repositories (as a proxy for developer activity) and StackOverflow activity (as a proxy for user discussion), as shown in the chart below.

Redmonk Q318

Compared to two quarters ago, R has dropped 2 places to #14. But as Stephen O'Grady notes, this ranking volatility is typical:

R is essentially a case study at this point for why we recommend against over-reacting to any particular quarter’s performance. R has dropped two spots previously, only to bounce back a quarter or two later. Its domain specific nature means that it is never likely to compete for the very top spots on this ranking, but it remains dominant and broadly used within its area of expertise.

You can see the complete rankings list, along with discussion of the languages covered, at the RedMonk blog linked below.

Tecosystems: The RedMonk Programming Language Rankings: June 2018

Because it’s Friday: A data center under the sea


Microsoft's Project Natick is investigating the feasibility of housing data centers (like those to power Azure) in submerged pressure vessels, with the goal of putting computational services near the demand, powered by renewable energy. That's awesome in its own right, but the Project Natick team also recently installed live underwater cams to view the pilot data center in the North Sea. Apparently, it attracts a lot of fish! I've been waiting to get a clear view of the data center for the screenshot below, but it's always swarmed by fish. Click on the image for a live camera -- you might get a better view than I did.

Many fish

That's all from the blog for this week. Have a great weekend, and we'll be back next week!

Building the Ultimate Developer PC 3.0 – The Parts List for my new computer, IronHeart


Ironheart is my new i9 PC

It's been 7 years since the last time I built "The Ultimate Developer PC 2.0," and over 11 since the original Ultimate Developer PC that Jeff Atwood built for me. That last PC was $3000 and, well, frankly, that's a heck of a lot of money. Now, I see a lot of you dropping $2k and $3k on MacBook Pros and Surfaces without apparently sweating it too much, but I expect that much money to last a LONG TIME.

Do note that while my job does give me a laptop for work purposes every 3 years, my desktop is my own, paid for with my own money and not subsidized by my employer in any way. This PC is mine.

I wrote about money and The Programmer's Priorities in my post on Brain, Bytes, Back, and Buns. As developers, we spend a lot of time looking at monitors, sitting in chairs, using computers, and sleeping. It stands to reason we should invest in good chairs, good monitors and PCs, and good beds. That also means good mice and keyboards, of course.

Was that US$3000 investment worth it? Absolutely. I worked on my PC 2.0 nearly every day for 7 years. That's ~2500 days at about $1.20 a day, if you consider upgradability.
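A quick check of that amortization math, using the round figures from above:

```python
# Amortizing the PC cost over its service life (figures from the post).
cost = 3000          # US$ paid for Ultimate Developer PC 2.0
days = 2500          # roughly 7 years of near-daily use
per_day = cost / days
print(f"about ${per_day:.2f} a day")  # -> about $1.20 a day
```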

Continuous PC Improvement via reasonably priced upgrades

How could I use the same PC for 7 years? Because it's modular.

  • Hard Drive - I upgraded 3 years back to a 512 gig Samsung 850 SSD and it's still a fantastic drive at only about $270 today. This kept my machine going and going FAST.
  • Video Card - I found a used NVidia 1070 on Craigslist for $250, but they are $380 new. A fantastic card that can do VR quite nicely, but for me, it ran three large monitors for years.
  • Monitors - I ran a 30" Dell as my main monitor that I bought used nearly 10 years ago. It does require a DisplayPort to Dual-Link DVI active adapter but it's still an amazing 2560x1600 monitor even today.
  • Memory - I started at 16 gigs and upgraded to 24 gigs when memory got cheaper.

All this adds up to me running the same first generation i7 processor up until 2018. And frankly, I probably could have gone another 3-5 years happily.

So why upgrade? I was gaming more and more as well as using my HTC Vive Pro and while the 1070 was great (although always room for improvement) I was pushing the original Processor pretty hard. On the development side, I have been running somewhat large distributed systems with Docker for Windows and Kubernetes, again, pushing memory and CPU pretty hard.

Ultimately, however, price/performance for build-your-own PCs got to a reasonable place, and the ubiquity of 4k displays at reasonable costs made me think I could build a machine that would last me a minimum of 5 years, if not another 7.

Specifications

I bought my monitors from Dell directly and the PC parts from NewEgg.com. I named my machine IRONHEART after Marvel's Riri Williams.

  • Intel Core i9-7900X 10-Core 3.3 GHz Desktop Processor - I like this processor for a few reasons. Yes, I'm an Intel fan, but I like that it has 44 PCI Express lanes (that's a lot), which means that since I'm not running SLI with my video card, I'll have MORE than enough bandwidth for any peripherals I can throw at this machine. Additionally, its caching situation is nuts. There's 640K of L1, 10 MB of L2, and 13.8 MB of L3. 640K ought to be enough for anyone, right? ;) It's also got 20 logical processors, plus Intel Turbo Boost Max will move specific cores to 4.5 GHz as needed, up from the base 3.3 GHz frequency. It can also support up to 128 GB of RAM; although I'll start with 32 gigs, it's nice to have the room to grow.
  • 288-pin DDR4 3200 MHz (PC4 25600) Memory, 4 x 8 GB - These also have a fun lighting effect, and since my case is clear, why not bling it out a little?
  • Corsair Hydro Series H100i V2 Extreme Performance Water/Liquid CPU Cooler - My last PC had a heat sink you could see from space. It was massive and unruly. This Cooler/Fan combo mounts cleanly and then sits at the top of the case. It opens up a TON of room and looks fantastic. I really like everything Corsair does.
  • WD Black 512GB Performance SSD - M.2 2280 PCIe NVMe Solid State Drive - It's amazing how cheap great SSDs are, and I felt it was time to take it to the next level and try M.2 drives. M.2 is the "next generation form factor" for drives and replaces mSATA. M.2 SSDs are tiny and fast. This drive can do as much as 2 gigs a second, as much as 3x the speed of a SATA SSD. And it's cheap.
  • CORSAIR Crystal 570X RGB Tempered Glass, Premium ATX Mid Tower Case, White - I flipping love this case. It's white and clear, but mostly clear. The side is just a piece of tempered glass. There are three RGB LED fans in the front (along with the two I added on the top from the cooler, and one more in the back), and they are all software controllable. The case also has USB ports on top, which is great since it's sitting under my clear glass desk. It is very well thought out and includes many cable routing channels, so your cables can be effectively invisible. Highly recommended.
    Clear white case The backside of the clear white corsair case
  • Corsair 120mm RGB LED Fans - Speaking of fans, I got this three pack, bringing the total 120mm fan count to 6 (7 if you count the GPU fan that's usually off).
  • TWO Anker 10 Port 60W USB hubs. I have a Logitech Brio 4k camera, a Peavey PV6 USB Mixer, and a bunch of other USB3 devices like external hard drives, Xbox Wireless Adapter and the like so I got two of these fantastic Hubs and double-taped them to the desk above the case.
  • ASUS ROG GeForce GTX 1080 Ti 11 gig Video Card - This was arguably over the top, but in this case I treated myself. First, I didn't want to ever (remember my 5 year goal) sweat video perf. I am/was very happy with my 1070, which is less than half the price, but as I've been getting more into VR, the NVidia 1070 can struggle a little. Additionally, I set the goal to drive three 4k monitors at 60 Hz with zero issues, and I felt that the 1080 was a solid choice.
  • THREE Dell Ultra HD 4k Monitors P2715Q 27" - My colleague Damian LOVES these monitors. They are an excellent balance in size and cost and are well-calibrated from the factory. They are a full 4k and support DisplayPort and HDMI 2.0.
    • Remember that my NVidia card has 2 DisplayPorts and 2 HDMI ports, but I want to drive 3 monitors and 1 Vive Pro. I run the center monitor off DisplayPort and the left and right off HDMI 2.0.
    • NOTE: The P2415Q and P2715Q both support HDMI 2.0 but it's not enabled from the factory. You'll need to enable HDMI 2.0 in the menus (read the support docs) and use a high-speed HDMI cable. Otherwise you'll get 4k at 30hz and that's really a horrible experience. You want 60hz for work at least.
    • NOTE: When running the P2715Q off DisplayPort from an NVidia card, you might initially get an output color format of YCbCr 4:2:2, which will make anti-aliased text have a colored haze while the HDMI 2.0 displays look great with RGB color output. You'll need to go into the menus of the display itself and set the Input Color Format to RGB, *and* also into the NVidia display settings after turning the monitor on and off, to get it to stick. Otherwise you'll find the NVidia Control Panel will reset to the less desirable YCbCr 4:2:2 format, causing one of your monitors to look different from the others.
    • Last note, sometimes Windows will say that a DisplayPort monitor is running at 59Hz. That's almost assuredly a lie. Believe your video card.
      Three monitors all running 4k 60hz
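The 30 Hz vs. 60 Hz distinction in the notes above comes down to raw link bandwidth. A rough estimate of the uncompressed pixel data rate (ignoring blanking intervals and encoding overhead, so the real requirement is somewhat higher):

```python
# Uncompressed video bandwidth for 4K at 24-bit color.
width, height, bpp = 3840, 2160, 24
for hz in (30, 60):
    gbps = width * height * bpp * hz / 1e9
    print(f"4K @ {hz} Hz needs roughly {gbps:.1f} Gbit/s of pixel data")
# ~6.0 Gbit/s at 30 Hz fits older HDMI links; ~11.9 Gbit/s at 60 Hz needs HDMI 2.0.
```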

What about perf?

Developers develop, right? A nice .NET benchmark is to compile Orchard Core both "cold" and "warm." I use .NET Core 2.1 downloaded from http://www.dot.net

Orchard is a fully-featured CMS with 143 projects loaded into Visual Studio. MSBuild and .NET Core 2.1 support both parallel and incremental builds.

  • A warm build of Orchard Core on IRONHEART takes just under 10 seconds.
    • My Surface Pro 3 builds it warm in 62 seconds.
  • A totally cold build (after a dotnet clean) on IRONHEART takes 33.3 seconds.
    • My Surface Pro 3 builds it cold in 2.4 minutes.
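For a rough sense of scale, the speedups those numbers imply (the Surface's cold build of 2.4 minutes is 144 seconds):

```python
# Build times from the benchmark above, in seconds.
warm_desktop, warm_surface = 10, 62
cold_desktop, cold_surface = 33.3, 2.4 * 60

print(f"warm build: {warm_surface / warm_desktop:.1f}x faster")  # -> 6.2x
print(f"cold build: {cold_surface / cold_desktop:.1f}x faster")  # -> 4.3x
```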

Additionally, the CPUs in this case weren't working at full speed very long. This may be as fast as these 143 projects can be built. Note also that Visual Studio/MSBuild will use as many processors as your system can handle. In this case it's using 20 procs.
MSBuild building Orchard Core across 20 logical processors

I can choose to constrain things if I think the parallelism is working against me; for example, here I can try with just 4 processors. In my testing it doesn't appear that spreading the build across 20 processors is a problem. I tried just 10 (the physical processors) and it builds in 12 seconds. With 20 processors (10 physical with hyperthreading, so 20 logical) it builds in 9.6 seconds, so there's clearly a law of diminishing returns here.

dotnet build /maxcpucount:4

Building Orchard Core in 10 seconds
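Those diminishing returns are what Amdahl's law predicts: once the serial portion of a build dominates, extra processors stop helping. A quick sketch (the 20% serial fraction is a made-up illustrative number, not measured from Orchard Core):

```python
def amdahl_speedup(n_procs, serial_fraction):
    """Maximum speedup on n_procs if serial_fraction of the work can't parallelize."""
    return 1 / (serial_fraction + (1 - serial_fraction) / n_procs)

for n in (4, 10, 20, 40):
    print(f"{n:2d} procs -> {amdahl_speedup(n, 0.2):.2f}x")
# Going from 10 to 20 procs adds far less than going from 4 to 10 did.
```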

Regardless, my podcast site builds in less than 2 seconds on my new machine, which makes me happy. I'm thrilled with this new machine, and I hope it lasts me for many years.

Why don't you go get .NET Core 2.1, clone Orchard Core from https://github.com/OrchardCMS/OrchardCore, and run this in PowerShell

measure-command { dotnet build } 

and let me know in the comments how fast your PC is with both cold and warm builds!

NOTE: I have an affiliate relationship with NewEgg and Amazon, so if you use my links to buy something I'll make a small percentage and you're supporting this blog! Thanks!


Sponsor: Preview the latest JetBrains Rider with its built-in spell checking, initial Blazor support, partial C# 7.3 support, enhanced debugger, C# Interactive, and a redesigned Solution Explorer.


© 2018 Scott Hanselman. All rights reserved.
     

How to enhance HDInsight security with service endpoints


HDInsight enterprise customers work with some of the most sensitive data in the world, and they want to be able to lock down access to this data at the networking layer as well. However, while service endpoints have been available for Azure data sources, HDInsight customers couldn't leverage this additional layer of security for their big data pipelines due to the lack of interoperability between HDInsight and other data stores. As we recently announced, we are excited that HDInsight now supports service endpoints for Azure Blob Storage, Azure SQL Database, and Azure Cosmos DB.

With this enhanced level of security at the networking layer, customers can now lock down their big data storage accounts to their specified Virtual Networks (VNETs) and still use HDInsight clusters seamlessly to access and process that data.

In the rest of this post, we will explore how to enable service endpoints and point out important HDInsight configurations for Azure Blob Storage, Azure SQL DB, and Azure Cosmos DB.

Azure Blob Storage

When using Azure Blob Storage with HDInsight, you can configure selected VNETs in the blob storage firewall settings. This ensures that only traffic from those subnets can access the storage account.

It is important to check "Allow trusted Microsoft services to access this storage account." This ensures that the HDInsight service will have access to the storage account and can provision the cluster in a seamless manner.

blobStorageSE4

 

If the storage account is in a different subscription than the HDInsight cluster, please make sure that the HDInsight resource provider is registered with the storage subscription. To learn more about how to register or re-register resource providers on a subscription, see additional resource providers and types. If the HDInsight resource provider is not registered properly, you might get an error message, which can be resolved by registering the resource provider.

NOTE: HDInsight cluster must be deployed into one of the subnets allowed in the blob storage firewall or the VNETs must be peered. This will ensure that the traffic from cluster VMs can reach the storage.

Azure SQL DB

If you are using an external SQL DB for a Hive or Oozie metastore, you can configure service endpoints. "Allow access to Azure services" is not a required step from the HDInsight point of view, since access to these databases happens after the cluster is created and the VMs are injected into the VNET.

sqlDBSE_thumb1

NOTE: HDInsight cluster must be deployed into one of the subnets allowed in the SQL DB firewall or the VNETs must be peered. This will ensure that the traffic from cluster VMs can reach the SQL DB.

Azure Cosmos DB

If you are using the spark connector for Azure Cosmos DB you can enable service endpoints in Cosmos DB firewall settings and seamlessly connect to it from HDInsight cluster.

cosmosDBSE_thumb

 

NOTE: The HDInsight cluster must be deployed into one of the VNETs allowed in the Cosmos DB firewall, or the VNETs must be peered. This ensures that the traffic from cluster VMs can reach the Cosmos DB account.

Try HDInsight now

We hope you take full advantage of today’s announcements and we are excited to see what you will build with Azure HDInsight. Read the developer guide and follow the quick start guide to learn more about implementing these pipelines and architectures on Azure HDInsight. Stay up-to-date on the latest Azure HDInsight news and features by following us on Twitter #HDInsight and @AzureHDInsight. For questions and feedback, please reach out to AskHDInsight@microsoft.com.

About HDInsight

Azure HDInsight is Microsoft’s premium managed offering for running open source workloads on Azure. Today, we are excited to announce several new capabilities across a wide range of OSS frameworks.

Azure HDInsight powers some of our top customers' mission-critical applications across a wide variety of sectors, including manufacturing, retail, education, nonprofit, government, healthcare, media, banking, telecommunications, and insurance, with use cases ranging from ETL to data warehousing, and from machine learning to IoT.

Azure #CosmosDB reference solution: Visualizing real-time data analysis with change feed


This blog post was co-authored by Devki Trivedi, Serena Davis, and Pascal Habineza.

The Cosmos DB Explorer Interns are excited to release a reference solution that utilizes the Azure Cosmos DB Change Feed feature (+ Azure Functions, Azure Event Hubs, Azure Stream Analytics, Power BI). This solution features a hands-on lab that guides users through the entire pipeline: from uploading data onto Cosmos DB to implementing a Change Feed trigger to visualizing that data in a meaningful way on Power BI.

With this reference solution, businesses can:

  • Track sales and site activity in real time 
  • Manipulate data to see only the metrics useful to them
  • Choose specific time windows for data to be processed

The Cosmos DB Change Feed provides an automatic mechanism for getting a continuous and incremental feed of modified or inserted records from Cosmos DB. A few of its most popular use cases include:

  1. Triggering microservices and functions within an event-driven architecture
  2. Replicating data to secondary data stores
  3. Real-time data processing

This reference solution focuses on real-time data processing, specifically to fit the needs of e-commerce companies. For example, while an e-commerce company may currently track the most popular products purchased, it may want to improve its advertising and pricing strategies by identifying products that are commonly being viewed and added to customers' carts, but never purchased. The business may want to know its hourly, daily, and weekly revenue. Such metrics can help the business develop its advertising, inventory investment, and user experience strategies. Using this solution, businesses can customize these metrics to meet their needs and can observe them in real time using the Change Feed.
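The "added to carts, but never purchased" metric described above can be sketched over a stream of events like so (the event shape here is hypothetical; the lab's actual schema may differ):

```python
from collections import defaultdict

# Hypothetical click-stream events: (user, product, action).
events = [
    ("u1", "shoes", "viewed"), ("u1", "shoes", "added"),
    ("u2", "shoes", "added"),  ("u2", "shoes", "purchased"),
    ("u3", "hat",   "viewed"), ("u3", "hat",   "added"),
]

# Collect the set of actions seen per product.
actions = defaultdict(set)
for user, product, action in events:
    actions[product].add(action)

# Products added to a cart but never bought by anyone in this window.
abandoned = [p for p, acts in actions.items()
             if "added" in acts and "purchased" not in acts]
print(abandoned)  # -> ['hat']
```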
  
This reference solution models the above activities for interactivity and explains the process of deploying Azure resources. While it focuses on real-time data processing within the context of e-commerce, it remains applicable to many different domains.
  
The result of this solution deployment is a dashboard as pictured below that updates in real time. Users can analyze business performance using metrics customizable to their requirements on the dashboard.

image 
The pipeline below illustrates how the solution processes shoppers' data from data generation to real-time data visualization with Power BI – all using Microsoft Azure!

As data is created by online shoppers, it is processed and written to Cosmos DB, and these data events are stored for future use. As this is happening, the Change Feed takes over. For every interaction, when a new event is created and pushed into Cosmos DB, the Change Feed adds that event to its records. Whenever the Change Feed adds an event, it triggers an Azure Function. The function iterates through each inserted document and sends it to Azure Event Hubs through an Event Hub client. Next, Azure Stream Analytics builds meaningful metrics by querying the data from Event Hubs and sending the results to Power BI for visualization.
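That flow can be sketched as a tiny in-memory simulation: new documents land in a store, a change-feed handler fires per batch of inserts, and each document is forwarded to a downstream sink. The classes and names here are illustrative stand-ins for Cosmos DB, the Azure Function, and Event Hubs, not real SDK calls:

```python
class MiniChangeFeed:
    """Toy stand-in for the Cosmos DB change feed: records inserts, notifies a handler."""
    def __init__(self, handler):
        self.store, self.handler = [], handler

    def insert(self, doc):
        self.store.append(doc)
        self.handler([doc])  # the "Azure Function" trigger fires with the changed batch

event_hub = []  # stand-in for the Event Hub sink
feed = MiniChangeFeed(lambda docs: event_hub.extend(docs))

for doc in ({"user": "u1", "action": "viewed"}, {"user": "u1", "action": "added"}):
    feed.insert(doc)

print(len(event_hub))  # -> 2: every insert flowed through the handler to the sink
```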

Watch Visualizing Real-Time Data Analysis with Cosmos DB Change Feed and check out our change feed lab to learn how your business can build an application to extract insightful metrics from your data in real time.

clip_image001

Thoughts? Please send us your feedback.

New customizations in Azure Migrate to support your cloud migration


You can accelerate your cloud migration using intelligent migration assessment services like Azure Migrate. Azure Migrate is a generally available service, offered at no additional charge, that helps you plan your migration to Azure. 

Azure Migrate discovers servers in your on-premises environment and assesses each discovered server’s readiness to run as an IaaS VM in Azure. In addition to Azure readiness, it helps you identify the right VM size in Azure after considering the utilization history of the on-premises VM. Azure Migrate also provides cost analysis for running the on-premises VMs in Azure. Additionally, if you have legacy applications, identifying the servers that constitute your application can become very complex. Azure Migrate helps you visualize dependencies of your on-premises VMs, so you can create high-confidence move groups and ensure that you are not leaving anything behind when you are moving to Azure.
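Right-sizing from utilization history usually means picking a size that covers a high percentile of observed usage rather than the single peak. A sketch of that idea (the VM sizes, capacities, and the 95th-percentile rule are illustrative, not Azure Migrate's actual algorithm):

```python
# Hypothetical hourly CPU samples (percent) for an on-premises VM, with one spike.
samples = sorted([12, 18, 25, 30, 35, 40, 42, 45, 55, 90])

# Size for the 95th percentile of usage instead of the one-off 90% spike.
p95 = samples[int(0.95 * (len(samples) - 1))]

vm_sizes = {"small": 30, "medium": 60, "large": 100}  # made-up capacity ceilings
chosen = next(name for name, cap in vm_sizes.items() if p95 <= cap)
print(p95, chosen)  # -> 55 medium: the single spike doesn't force the largest size
```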

Dependencies

Azure Migrate currently supports discovery and assessment of VMware-virtualized Windows and Linux VMs; support for Hyper-V will be enabled in the future.

When it comes to migration planning, every organization is unique and there is no one-size-fits-all solution. Each organization has its own needs, and in this blog post I am going to talk about some of the new features added in Azure Migrate that can help you effectively plan your migration to the cloud based on those needs.

Here is a summary of the features recently enabled in Azure Migrate allowing you to customize the assessments to meet your migration needs.

  • Reserved Instances (RI): Azure Migrate now allows you to plan migration to Azure with your virtual machines running in Azure VM Reserved Instances (RIs). Azure Migrate will model the cost advantages of RIs in the assessments. You can customize your assessment to include use of RIs by editing the assessment settings.
  • VM series: The Azure Migrate assessment provides a new control to specify the VM series to be used for size recommendations. Each VM series in Azure is designed for a certain kind of workload, for example, if you have a production workload which is IO intensive, you may not want to move the VMs to the A-series VMs in Azure, which are more suitable for entry level workloads. Using Azure Migrate, you can now apply your organization preferences and get recommendations based on your choices for the selected VM series.
  • Storage type: Many organizations want to move their VMs to VM sizes in Azure that provide a single instance VM SLA of > 99.9 percent. You can achieve the same if all disks attached to the VM are premium disks. If you have such SLA requirements, you can use the storage type property in an assessment and specify premium disks as your storage type to ensure that the VM size recommendations are done accordingly.
  • VM uptime: Not all workloads need to run 24x7 on the cloud. For example, if you have Dev-Test VMs that you only run 10 hours a day and only five days a week, you can now use the VM uptime property in assessment to specify the uptime details of the VMs and the cost estimation of running the VMs in Azure will be done accordingly.
  • Azure Government: If you are planning to migrate your workloads to the Azure Government cloud, you can now use Azure Migrate to plan your migrations for Azure Government as a target location. In addition to Azure Government, Azure Migrate already supports more than 30 different regions as assessment target location.
  • Windows Server 2008 (32-bit & 64-bit): If you have Windows Server 2008 VMs in your environment, you can now leverage the option to get free extended security updates by migrating these VMs to Azure. Azure Migrate will now help you identify such VMs as Azure ready VMs which can be migrated to Azure using Azure Site Recovery.
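The VM uptime adjustment above is straightforward to reason about: cost scales with the fraction of the month the VM actually runs. A sketch using the Dev-Test pattern from the example and an illustrative (not real) hourly rate:

```python
hourly_rate = 0.20                 # illustrative $/hour for a VM size, not a real Azure price
hours_per_month = 24 * 30          # a full 24x7 month, roughly

# Dev-Test pattern from the example: 10 hours a day, 5 days a week.
uptime_hours = 10 * 5 * (52 / 12)  # ~216.7 hours a month

full_cost = hourly_rate * hours_per_month
devtest_cost = hourly_rate * uptime_hours
print(f"24x7: ${full_cost:.2f}/mo, dev-test: ${devtest_cost:.2f}/mo")
# -> 24x7: $144.00/mo, dev-test: $43.33/mo
```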

Properties

With the enhanced features, Azure Migrate provides you a lot of power and flexibility to customize your migration plan based on your own requirements. Learn more about how you can customize assessments using Azure Migrate.

Once you are ready with your migration plan, you can use services like Azure Site Recovery and Database Migration Service to migrate your workloads to Azure. Use the Azure migration center to learn when to use these services in your selected migration strategy.

We are continuously improving Azure Migrate based on the feedback that we have been receiving from all of you. If you have any ideas or see any gaps in Azure Migrate, we would love to hear from you. Use our UserVoice forum to post your ideas and learn about the roadmap.

So, what are you waiting for? Get started with Azure Migrate to begin your datacenter transformation journey!

New Lenovo ThinkPad P1 mobile workstation combines style and power


Lenovo has launched a brand-new addition to its ThinkPad mobile workstation portfolio—the ThinkPad P1. Lenovo’s thinnest, lightest, and sleekest mobile workstation, the ThinkPad P1 gives users the style they want and the performance they need.

Image of the Lenovo ThinkPad P1.

No company combines high performance with beauty better than luxury sports car manufacturer Aston Martin. For them, the ThinkPad P1 represents not only an opportunity to increase productivity, but also to work with a high-quality, high-performing product in a fast-paced environment.

“At Aston Martin, we are committed to a high standard of excellence with a drive to push to the next level,” said Neil Jarvis, director of IT and Innovation at Aston Martin. “We see the expression of similar values in the creation of the new ThinkPad P1. Whether it’s our creative team benefiting from the professional graphics and stunning display, or our executive team looking for an ultra-thin and light system with a touch of style, the ThinkPad P1 is the right fit for a variety of users across our company.”

The ThinkPad P1 provides users with a premium experience—both in look and feel and superior construction and components. With a signature black finish, glass touchpad, and seamless keyboard, the ThinkPad P1 has the high-end design to deliver the ultimate out-of-box experience.

Image of a worker working in his office on a Lenovo ThinkPad.

This attention to detail goes even further with the ThinkPad P1 power supply. Customers know a bulky power supply runs counter to a thin and light workstation. To address this, Lenovo reduced the weight by 35 percent to ensure the ThinkPad P1 meets every requirement in the thin and light category.

Here are some other features of this mobile workstation:

  • Certified for key ISV applications, featuring 8th Gen Intel Xeon and Core processors, including support for the Core i9 CPU, ECC memory support, and speeds up to 4.6 GHz.
  • Boosted performance with the latest NVIDIA Quadro P1000 and P2000 professional graphics cards.
  • A 15-inch 4K UHD display covering 100 percent of the Adobe color gamut, with a touchscreen and IR camera standard.
  • 4 TB of M.2 PCIe premier storage and 64 GB of memory at 2667 MHz.

With more than 25 years of experience developing top commercial laptops, ThinkPad P1 design engineers chose the best features from the ThinkPad portfolio to maximize power, performance, and premium design from top to bottom.

“When we set out to create the ThinkPad P1, we knew our challenge was to build a mobile workstation that would carry the legacy of professional power and reliability of our ThinkPad portfolio, but also meet our customer’s need for a thin, light, and sleek design,” says Rob Herman, general manager of Workstations at Lenovo. “Whether you are looking for power, the lightest mobile workstation around, or sleek and slim tech-envy, the ThinkPad P1 delivers on all counts, period.”

Image of a Lenovo ThinkPad sitting open in a warehouse.

Lenovo is also introducing its new ThinkPad P72. Purpose-built for users looking for top-of-the-line power and performance, it is the ideal choice for users in the oil and gas, automotive, and financial industries. A true desktop replacement, its 17-inch chassis includes the latest 8th Gen Intel Xeon and Core processors, and the most powerful NVIDIA Quadro graphics—up to P5200—to tackle the most demanding workflows with ease. With up to 6 TB of storage, 128 GB of memory, and 16 GB of Intel Optane memory, users have vast amounts of compute power to handle immense data sets.

Lenovo is committed to understanding its customers’ evolving needs and daily demands of their workflows and workloads. With the introduction of the ThinkPad P1 and the ThinkPad P72, Lenovo is addressing two very distinct and unique groups of customer needs.

Both will be available at the end of August 2018. The ThinkPad P1 starts at $1,949 and the ThinkPad P72 starts at $1,799. Learn more about the ThinkPad P1 and the ThinkPad P72.

The post New Lenovo ThinkPad P1 mobile workstation combines style and power appeared first on Microsoft 365 Blog.


Using MSVC in a Docker Container for Your C++ Projects


Containers encapsulate the runtime environment of an application: the file system, environment settings, and virtualized OS are bundled into a package. Docker containers have changed the way we think about build and test environments since they were introduced five years ago.

Visual Studio’s setup and install expert, Heath Stewart, blogs regularly about how to install the Visual Studio Build Tools in a Windows Docker container. Recently he explained why you won’t find a container image for the Build Tools: basically, any install that includes a workload you aren’t using is going to be bigger than you want, so you should tailor an install for your needs. Here we are going to show how to create and use an MSVC Docker container.

The first step is to install Docker Community Edition. If you don’t have Hyper-V enabled, the installer will prompt you to enable it and reboot. After it is installed, you need to switch to Windows containers. The official documentation on how to Install Build Tools into a Container walks through this in much more detail. If you refer to that, stop at step 5 and read the following instructions for creating an MSVC Docker container.

Remember that the VS Build Tools are licensed as a supplement to your existing Visual Studio license. Any images built with these tools should be for your personal use or for use in your organization in accordance with your existing Visual Studio and Windows licenses. Please don’t share these images on a public Docker hub.

Visual Studio Build Tools Dockerfiles

To help you get started in creating Dockerfiles tailored for your needs we have VS Build Tools Dockerfile samples available. Clone that repository locally. There you will find a directory for native-desktop that contains the files we will use as our starting point. If you open that Dockerfile you will see the following line within.

--add Microsoft.VisualStudio.Workload.VCTools --includeRecommended 

That line adds the VCTools workload and additional recommended components that today include CMake and the current Windows SDK. This is a smaller installation than our other sample that includes support for building managed and native projects. To learn more about how you can find the components you need see our additional workloads and components documentation.

Open PowerShell in the native-desktop directory of that repo and build the Docker image.

docker build -t buildtools2017native:latest -m 2GB .

The first time you build the image it will pull down the windowsservercore image, download the build tools, and install them into a local container image.

When this is done you can see the images available locally.

docker images
REPOSITORY                 TAG                            IMAGE ID       CREATED         SIZE
Buildtools2017native       latest                         4ff1cd971254   3 minutes ago   14.9GB
Buildtools2017             latest                         325048ba2240   4 hours ago     22.4GB
microsoft/dotnet-framework 3.5-sdk-windowsservercore-1709 7d89a4baf66c   3 weeks ago     13.2GB

While our buildtools2017native image is large, it is significantly smaller than the VS Build Tools image that includes managed support. Note that these image sizes include the base image, so they can be deceiving in terms of actual disk space used.

Trying it all out

Let’s test out our build tools container using a simple program. Create a new Windows Console Application project in Visual Studio. Add iostream and write out a hello-world message like so.

#include "stdafx.h"
#include <iostream>
using namespace std;
int main()
{
  cout << "I was built in a container..." << endl;
}

We are going to want to deploy this in a future post to a Windows Nano Server container so change the platform target to x64. Go to your project properties and under C/C++ change the Debug Information Format to C7 compatible.
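Those two project settings correspond to entries like the following in the generated .vcxproj file. This is a hypothetical fragment showing only the relevant property; your actual project file will contain much more.

```xml
<!-- Fragment of ConsoleApplication1.vcxproj (illustrative).
     "C7 compatible" in the IDE maps to the OldStyle value, i.e. the /Z7
     compiler switch, which embeds debug info in the .obj files. -->
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
  <ClCompile>
    <DebugInformationFormat>OldStyle</DebugInformationFormat>
  </ClCompile>
</ItemDefinitionGroup>
```

Embedding debug information this way avoids a separate PDB produced by the compiler, which simplifies debugging binaries built inside a container.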

Now we are going to create a build container for our project. Note this line in the Dockerfile we used to create our buildtools2017native image.

ENTRYPOINT C:\BuildTools\Common7\Tools\VsDevCmd.bat &&

The && in that line allows us to pass parameters in when we create a container using this image. Here we will use it to pass in the msbuild command with the options necessary to build our solution. When we start our container for the first time, we will map our local source into a directory in the container using the -v option. We will also give the container a name using --name so we remember what it is for later. That also makes it easier to run again as we update our source code.

docker run -v C:\source\ConsoleApplication1:c:\ConsoleApplication1 --name ConsoleApplication1 buildtools2017native msbuild c:\ConsoleApplication1\ConsoleApplication1.sln /p:Configuration=Debug /p:Platform=x64

This will spin up a container, use MSBuild to build our solution, then exit and stop the container. You should see output from this process as it runs. Since we mapped to our local volume, you will find the build output under your solution directory under x64\Debug. You can run ConsoleApplication1.exe from there and you’ll see the output

I was built in a container...

As you change your sources you can run the container again by running this command (omit the -a attach option if you don’t need to see the output from within the container).

docker start -a ConsoleApplication1

Conclusion

This gives us a working MSVC toolset in an isolated, easy-to-use Docker container. You can now spin up additional containers for different projects using the approach outlined above. The disk space used by additional containers sharing the same base image is minuscule compared to spinning up VMs to get isolated build environments.

In a future post we’ll cover how to take the build output from this process, run it in a container, then attach to the process in the container and debug it.

If you are using containers or doing cloud development with C++, we would love to hear from you. We would appreciate it if you could take a few minutes to complete our C++ cloud and container development survey. This will help us focus on topics that are important to you on the blog and in the form of product improvements.

Azure Marketplace new offers: July 1–15


We continue to expand the Azure Marketplace ecosystem. From July 1 to 15, 97 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual Machines

Bloombase StoreSafe

Bloombase StoreSafe: Bloombase StoreSafe delivers agentless, non-disruptive, application-transparent encryption security of data at-rest in enterprise storage systems, enabling customers to lock down their crown jewels and achieve regulatory compliance requirements.

BlueCat DNS Edge Service Point VM for Azure

BlueCat DNS Edge Service Point VM for Azure: A virtual service point that extends DNS Edge capabilities into Azure environments, Edge adds a much-needed layer of visibility, control, and detection for corporate networks under siege from malware attacks that exploit DNS.

Centos 7 webserver LAMP with WAF

Centos 7 webserver (LAMP) with WAF: Secure your system with a CentOS LAMP server with a Web Application Firewall, and use the Rimau WAF Panel to manage your security configuration and rules.

Dell EMC Data Domain Virtual Edition - DD VE v4.0

Dell EMC Data Domain Virtual Edition (DD VE) v4.0: Dell EMC Data Domain Virtual Edition (DD VE) is a software-defined version of Dell EMC Data Domain. It can be downloaded, deployed, and configured in minutes on any server, converged or hyper-converged.

Dell EMC NetWorker Virtual Edition

Dell EMC NetWorker Virtual Edition: NetWorker Virtual Edition (NVE) offers protection for enterprise applications and databases. NVE extends a robust feature set, deduplication integration, and simplified management of NetWorker software for Azure cloud.

Eastwind Sensor for CloudVu

Eastwind Sensor for CloudVu: Get a more complete picture of your cyber terrain with Azure Network Traffic analysis, providing visibility, threat analysis, and user and entity behavioral analytics to identify malicious activity, insider threats, and data leakage.

Hardened Node js PHP IIS on Windows Server 2012 R2

Hardened Node.js PHP IIS on Windows Server 2012 R2: Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run across distributed devices.

ISP OCR on Azure

ISP OCR on Azure: This optical character recognition solution analyzes forms and images. Use it for receipt OCR analysis, including receipts longer than 25 centimeters, or for analysis of a driver's license, with the ability to mask sensitive information.

LimeSurvey by tunnelbiz

LimeSurvey by tunnelbiz: LimeSurvey is open-source survey software with advanced features like branching and multiple question types, making it a valuable tool for survey creation.

Netsweeper 6 0 6

Netsweeper 6.0.6: Manage your network effectively and efficiently with carrier-grade web content filtering.

php linux server with laravel framework

php linux server with laravel framework: Laravel is a free, open-source PHP web framework intended for the development of web applications following the model–view–controller (MVC) architectural pattern and based on Symfony.

QRadar SIEM BYOL

QRadar SIEM (BYOL): QRadar’s approach to security analytics chains together related events to provide security teams with a single alert on each potential incident. This advanced correlation reduces alert fatigue and streamlines attack detection.

Softadmin - Low code business automation platform

Softadmin® – Low code business automation platform: Get started with Softadmin, the low-code platform for efficiency and automation regardless of your business type or size. The components provide professional standardization and high scalability.

Zultys MXvirtual from Connector73

Zultys MXvirtual from Connector73: This unified communication solution and IP phone system in an Azure virtual machine integrates voice, video, data, and mobility to optimize collaboration and communication for businesses of all sizes.

Web Applications

AcuPebble

AcuPebble: Acurable's clinical research biosignals monitoring sensor is a wearable medical device that records respiratory and cardiac acoustic signals to help diagnose and treat sleep apnea, epilepsy, asthma, and COPD.

CSP Control Center

CSP Control Center: CSP Control Center (C3) is a cloud platform built for Microsoft CSP partners, enabling them to distribute, sell, bill, and provision cloud solutions. The platform supports not only the direct distributors but also the indirect resellers.

Kaspersky Hybrid Cloud Security BYOL

Kaspersky Hybrid Cloud Security (BYOL): Kaspersky Hybrid Cloud Security enables a seamlessly orchestrated and adaptive cybersecurity ecosystem. Wherever you process and store critical business data, it protects your workloads against the most advanced threats.

RadiantOne Federated Identity Service

RadiantOne Federated Identity Service: Ease the move to Azure Active Directory for companies and organizations with many distributed identity sources (AD domains and forests, LDAP, SQL, and APIs). Integrate and sync your ID data sources and Azure AD.

Consulting Services

2-DAY Azure Cloud Transformation Workshop and POC

2-DAY Azure Cloud Transformation Workshop & POC: Microsoft Azure is a powerful platform, but getting started can be daunting. Begin your Azure cloud journey with Steeves and Associates to gain the tech know-how to move forward with Azure.

Advanced Analytics and AI - half-day Briefing

Advanced Analytics and AI: 1/2 Day Briefing: During this half-day workshop, Dimensional Strategies Inc. will provide powerful insights on machine learning and how industries can extract value from their data by leveraging Microsoft's machine learning tools.

AI Envisioning - 1-Day Workshop

AI Envisioning: 1-Day Workshop: Guided by T4G experts, this collaborative envisioning engagement is designed to harness the power of artificial intelligence to amplify your business strategy.

AI Quick Start - 4-Week POC

AI Quick Start: 4-Week POC: This proof of concept facilitated by T4G is an effective way for your business to explore the value of artificial intelligence. This four-week engagement is a guided exploration of select use cases using client data in a private and secure environment.

App Modernization Smart Start - 2-Hr Briefing

App Modernization “Smart Start": 2-Hr Briefing: Separate fact from fiction, establish project direction, and create a foundation for success with a Smart Start introductory session (up to two hours) led by senior Azure architects. A custom agenda will match your cloud objectives.

Application Development - 3-Day Assessment

Application Development: 3-Day Assessment: Teksouth will review your current environment, gather requirements, and provide a cost estimate and a timeline to move forward. This offering applies to Azure, .NET, and SQL Server-based solutions in Azure, on premises, or hybrid.

Azure AD Features - 5-Day Implementation

Azure AD Features: 5-Day Implementation: Do you own Azure AD Premium but don't use its features? We deploy self-service password reset and multi-factor authentication (MFA) in under a week.

Azure Backup Managed Service

Azure Backup Managed Service: Betach will help to create or modify your backup strategy, and offer services to back up your server states and the data that resides on them. Betach provides peace of mind knowing that your infrastructure's backup is ready if needed.

Azure Cloud Transformation 1-Day Workshop

Azure Cloud Transformation 1-Day Workshop: Steeves and Associates works with you to understand and assess your current problems and limitations in order to determine an appropriate cloud roadmap and identify "quick wins" and opportunities for innovation.

Azure Custom App Modernization - 2-Day Workshop

Azure Custom App Modernization: 2-Day Workshop: Softlanding’s two-day engagement will cover the benefits of Azure and guide you through the process of reviewing your existing application and mapping it to Azure Platform-as-a-Service.

Azure Data Jumpstart - 2-Wk PoC

Azure Data Jumpstart: 2-Wk PoC: Accelerate the migration of your data platform to Azure. The Coeo Azure Data Jumpstart helps customers assess their current estate, improve agility, and create a plan for modernization as they move to Microsoft Azure.

Azure Data Optimization - 2-Wk Workshop

Azure Data Optimization: 2-Wk Workshop: Optimize your Azure data services solution with this two-week jumpstart offering by Coeo, which offers design assurance, cost effectiveness, and performance optimization.

Azure Dev-Test FastStart - 2-Week Implementation

Azure Dev-Test FastStart: 2-Week Implementation: Softlanding will guide you through the process of reviewing your existing virtual machines and systems and will design a proper dev/test environment in Azure with proper governance in place.

Azure Disaster Recovery FastStart - 10-day Workshop

Azure Disaster Recovery FastStart: 10-day Workshop: Softlanding can help you extend your major IT systems to Microsoft Azure for simple, automated protection. Create, configure, and maintain your business-critical apps within Softlanding’s Azure Disaster Recovery FastStart.

Azure Health Check - 5-Day Assessment

Azure Health Check: 5-Day Assessment: Interested in a healthier Azure environment and reduced costs? The Vigilant.IT Azure Health Check identifies what's unhealthy, helps your staff understand the issues, and identifies how you can save.

Azure Identity Management Jumpstart - 2-Wk PoC

Azure Identity Management Jumpstart: 2-Wk PoC: Optimize your cloud user identity strategy with this two-week proof of concept by Coeo. You will receive a documented review of your security strategy including a high-level roadmap and prioritized next steps.

Azure Server Migration - 4-Day Proof of Concept

Azure Server Migration: 4-Day Proof of Concept: This four-day proof of concept can help you migrate servers to Azure based on best practices. It is for executives and IT managers/decision-makers, and is primarily held on-site at the client's facility.

Azure SQL Managed Instance - 3 Day Proof of Concept

Azure SQL Managed Instance: 3 Day Proof of Concept: Let Dimensional Strategies Inc. show you how easy it is to move your on-premises SQL Server databases to the cloud with Azure SQL Database Managed Instance.

Azure Test Dev Environment - 5-Day Workshop

Azure Test/Dev Environment: 5-Day Workshop: Create a cloud-based lab so your organization can develop and test your application while significantly reducing delivery time and costs.

Big Data Architecture - 3-Wk Assessment

Big Data Architecture: 3-Wk Assessment: Alithya's assessment is designed to help identify which Big Data and Azure technologies are most appropriate based on the volume, velocity, and variety of the data sources under consideration.

Big Data Launchpad - 6-Wk Implementation

Big Data Launchpad: 6-Wk Implementation: Use Azure Data Lake and HDInsight to maximize the value of your data with this launchpad offering by Coeo. Envision success with big data, identify key data assets, and gain a roadmap to production deployment.

Blockchain for Enterprise - 1-Day Workshop

Blockchain for Enterprise: 1-Day Workshop: This instructor-led classroom training is for non-technical and business development experts, who will learn how to build industry-tailored solutions using a wide range of blockchain frameworks.

Carbonite Endpoint Protection - Azure 500 Bundle

Carbonite Endpoint Protection - Azure 500 Bundle: Carbonite Professional Services will deploy our PaaS offering onto a customer's Azure EA, enabling endpoints to be backed up to Azure. The customer receives licensing rights to protect up to 500 devices for 24 months.

Carbonite Migrate - SQL 1TB 4-Wk Implementation

Carbonite Migrate - SQL 1TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your SQL migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - SQL 5TB 4-Wk Implementation

Carbonite Migrate - SQL 5TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your SQL migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - SQL 20TB 4-Wk Implementation

Carbonite Migrate - SQL 20TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your SQL migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - SQL 50TB 4-Wk Implementation

Carbonite Migrate - SQL 50TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your SQL migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - Linux 1TB 4-Wk Implementation

Carbonite Migrate- Linux 1TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your Linux migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - Linux 5TB 4-Wk Implementation

Carbonite Migrate- Linux 5TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your Linux migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - Linux 20TB 4-Wk Implementation

Carbonite Migrate- Linux 20TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your Linux migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - Linux 50TB 4-Wk Implementation

Carbonite Migrate- Linux 50TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your Linux migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - Windows 1TB 4-Wk Implementation

Carbonite Migrate-Windows 1TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your Windows migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - Windows 5TB 4-Wk Implementation

Carbonite Migrate-Windows 5TB, 4-Wk Implementation: Carbonite Professional Services will ensure that all aspects of your Windows migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - Windows 20TB 4-Wk Implementation

Carbonite Migrate-Windows 20TB, 4Wk Implementation: Carbonite Professional Services will ensure that all aspects of your Windows migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Carbonite Migrate - Windows 50TB 4-Wk Implementation

Carbonite Migrate-Windows 50TB, 4Wk Implementation: Carbonite Professional Services will ensure that all aspects of your Windows migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Citrix On Azure - 6-Day Proof of Concept

Citrix On Azure: 6-Day Proof of Concept: This six-day proof of concept will help you seamlessly migrate your legacy Citrix workloads to Microsoft Azure. It is for executives, IT managers, and decision-makers, and is primarily held on-site at the client’s facility.

Cloud Organisational Readiness - 5-Day Assessment

Cloud Organisational Readiness: 5-Day Assessment: BJSS' 120-point assessment brings together stakeholders from across the organization to review the platforms, processes, and resources required for successful transition to the cloud.

Cloud Platform Framework - 2-Day Assessment

Cloud Platform Framework: 2-Day Assessment: AMTRA’s Platform Adoption Framework for Microsoft Azure provides organizations the tools and resources to develop a pragmatic, prescriptive, and actionable cloud adoption roadmap.

Cloud Readiness Assessment

Cloud Readiness Assessment: CNI’s assessment is a programmatic approach to reduce cloud and general IT operating costs, provide better services to stakeholders in a more agile manner, and provide recommendations on all security and governance issues.

Cloud Standalone - 4-Wk PoC

Cloud Standalone: 4-Wk PoC: This proof of concept by Diaxion will cover Azure tenancy and networking requirements, subscription model planning, information security and identity management planning, billing, storage, and more.

Cloudify for ISVs - 3 Day Implementation

Cloudify for ISVs: 3 Day Implementation: Cloudify enables any classic application to be delivered as a Software-as-a-Service on Azure in three business days. Your classic application will be able to use Azure Active Directory for authentication and single sign-on.

Coeo Architecture - 2-Day Workshop

Coeo Architecture: 2-Day Workshop: Find a data platform strategy that is cost-effective and operationally appropriate for your business. Coeo’s architecture workshop offers personalized and experience-based guidance to help you transform confidently and effectively.

Data Platform Healthcheck - 2-Wk Workshop

Data Platform Healthcheck: 2-Wk Workshop: Solve SQL Server data platform performance problems, provide operational stability, and improve security with Coeo’s thorough review. Ensure your databases and their surrounding environment are running optimally.

Data Platform License Optimization - 2-Wk Workshop

Data Platform License Optimization: 2-Wk Workshop: Modernize and optimize your Microsoft Data Platform licensing with this workshop by Coeo. Coeo will provide a high-level solution design that supports growth, providing the right level of performance and availability.

Data Security Architecture - 2-Wk Workshop

Data Security Architecture Workshop: 2-Wk Workshop: This workshop helps you secure and protect sensitive data while maintaining compliance. Coeo will conduct a maturity assessment and create a roadmap to implement security measures to protect your data assets.

Disaster Recovery 1-Week Implementation

Disaster Recovery 1-Week Implementation: Betach DRaaS uses Microsoft Azure to protect applications and data from disruption and to provide a total system backup in the event of system failure. This implementation includes assessment, roadmap, and engagement to implement DRaaS.

Enabling and Securing Modern Workplace - 2-Day POC

Enabling & Securing Modern Workplace: 2-Day POC: Technical leaders and decision-makers will work with SyCom's engineers to demonstrate such productivity and security solutions as Azure Active Directory Premium, Azure Information Protection, and Advanced Threat Analytics.

ExpressRoute - 5-Day Implementation

ExpressRoute: 5-Day Implementation: The ExpressRoute deployment solution provides a dedicated connection to Azure that addresses such network challenges as congested Internet links resulting in slow Azure performance and latency issues leading to unresponsive services.

Healthcare Cloud Security Stack for Azure - 12 Mo

Healthcare Cloud Security Stack for Azure: 12 Mo: Healthcare Cloud Security Stack for Azure uses the Qualys Cloud Agent, Trend Micro Deep Security, and XentIT Executive Dashboard as a unified cloud threat management solution.

Healthcare Cloud Security Stack for Azure - 30 Days

Healthcare Cloud Security Stack for Azure: 30 Days: Get a 30-day implementation of Healthcare Cloud Security Stack for Microsoft Azure, including Qualys Vulnerability Management and Cloud Agents, Trend Micro™ Deep Security™, and unified dashboard by XentIT.

Hybrid File Storage - 5-Day Implementation

Hybrid File Storage: 5-Day Implementation: Interested in lowering file storage costs by up to 60 percent without sacrificing performance? Our solution provides a fast local cache for frequently accessed files and low-cost Azure storage for the rest.

Intelligent Apps Discovery - 1-Day Workshop

Intelligent Apps Discovery: 1-Day Workshop: Guided by T4G experts, this collaborative engagement will assess your readiness, surface new ideas, and reveal how intelligent software can transform your organization.

Intelligent Apps Discovery-Design - 4-Week Workshop

Intelligent Apps Discovery-Design: 4-Week Workshop: This collaborative engagement will assess your readiness and surface new ideas, then we'll design, test, and plan how intelligent apps can transform your organization.

Introduction to Machine Learning - 1-Day Workshop

Introduction to Machine Learning: 1 Day Wksp: Coeo’s Introduction to Machine Learning is a one-day classroom training course that will give attendees the knowledge they need to drive forward data insights in their company.

Machine Learning Suitability - 3-Wk Assessment

Machine Learning Suitability: 3-Wk Assessment: Alithya's machine learning suitability assessment identifies if machine learning is the right approach for your business problem and provides recommendations for the most beneficial next steps.

Microsoft Security Trifecta - 4-Day POC

Microsoft Security Trifecta: 4-Day POC: Technical leaders and decision-makers will work with engineers from SyCom Technologies to demonstrate Windows Defender Advanced Threat Protection, Azure Advanced Threat Protection, and Office 365 Advanced Threat Protection.

Migrate to Azure Stack - Windows 5Tb 4-Wks

Migrate to Azure Stack - Windows 5Tb (4-Wks): The Carbonite Professional Services team will ensure that all aspects of your Windows migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - Windows 20Tb 4-Wks

Migrate to Azure Stack - Windows 20Tb (4-Wks): The Carbonite Professional Services team will ensure that all aspects of your Windows migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - Windows 50TB 4-Wks

Migrate to Azure Stack - Windows 50TB (4-Wks): The Carbonite Professional Services team will ensure that all aspects of your Windows migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - Linux 1TB 4-Wks

Migrate to Azure Stack- Linux 1TB (4-Wks): The Carbonite Professional Services team will ensure that all aspects of your Linux migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - Linux 5TB 4-Wks

Migrate to Azure Stack- Linux 5TB (4-Wks): The Carbonite Professional Services team will ensure that all aspects of your Linux migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - Linux 20TB 4-Wks

Migrate to Azure Stack- Linux 20TB (4-Wks): The Carbonite Professional Services team will ensure that all aspects of your Linux migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - Linux 50TB 4-Wks

Migrate to Azure Stack- Linux 50TB (4-Wks): The Carbonite Professional Services team will ensure that all aspects of your Linux migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - SQL 1TB 4-Wk Implementation

Migrate to Azure Stack-SQL 1TB, 4Wk Implementation: The Carbonite Professional Services team will ensure that all aspects of your SQL migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - SQL 5TB 4-Wk Implementation

Migrate to Azure Stack-SQL 5TB, 4Wk Implementation: The Carbonite Professional Services team will ensure that all aspects of your SQL migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - SQL 20TB 4-Wks

Migrate to Azure Stack-SQL 20TB, 4-Wks: The Carbonite Professional Services team will ensure that all aspects of your SQL migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrate to Azure Stack - SQL 50TB 4-Wks

Migrate to Azure Stack-SQL 50TB, (4Wks): The Carbonite Professional Services team will ensure that all aspects of your SQL migration project are a success, from the initial scoping of the project to the ultimate cut-over of your servers.

Migrating to MSFT Azure Data Services - 1-Day Workshop

Migrating to MSFT Azure Data Services: 1 Day Wksp: Coeo’s Migrating to Microsoft Azure Data Services is a one-day introductory classroom training course that will teach attendees how to store data securely in Microsoft Azure so that it can be retrieved quickly and efficiently.

Modern Analytics Platform - Half-Day Briefing

Modern Analytics Platform: 1/2 Day Briefing: Data-driven organizations need a modern analytics platform that empowers information workers and satisfies the needs of end users. Explore enterprise adoption and integration of new data sources and business logic.

Prime Brokerage Reporting - 3-Wk Assessment

Prime Brokerage Reporting: 3-Wk Assessment: Alithya's assessment helps banks to determine the appropriate architecture to fulfill Prime Brokerage Services needs. The client will receive a report identifying next steps and a roadmap to a unified customer experience.

Production Ready Cloud Platform

Production Ready Cloud Platform: CNI Production Ready Cloud Platform provides a smart and quick route to the cloud. The CNI Production Ready Cloud Platform addresses networking, security, design architecture, and ongoing management.

Rapid Commerce - A Microservices Commerce Platform

Rapid Commerce: A Microservices Commerce Platform: Shift your retail business into high gear with a cloud-based, microservices accelerator and seven-week pilot by Publicis.Sapient. The Rapid Commerce solution is an out-of-the-box toolkit for large retailers.

RCG - enable Credit Analytics

RCG|enable® Credit Analytics: This Credit Analytics solution can be used to manage risk of loan portfolios, maximize risk-adjusted rate of return, minimize capital reserve requirements, and strengthen compliance. A 30-day trial of Credit Analytics is offered through 2018.

SAP HANA Transformation - 3-Wk Proof-of-Concept

SAP HANA Transformation: 3-Wk Proof-of-Concept: This proof of concept for SAP HANA migration to Azure can start your digital transformation journey. Get help planning, designing, executing and validating this migration to leverage HANA and S/4HANA for your business.

SAP Jumpstart Package - 10-Wk Implementation

SAP Jumpstart Package: 10-Wk Implementation: Wish your SAP content wasn’t in silos and was more accessible, cost-effective, and manageable? If so, you should consider Document Management for SAP.

SAP on Azure FREE 1-Day Cloud Readiness Workshop

SAP on Azure FREE 1-Day Cloud Readiness Workshop: This complimentary workshop from Qnovate is focused on gathering data points to provide a recommended approach (with fixed cost and time) to migrating an existing or new SAP infrastructure onto the Azure Cloud.

SAP on Azure FREE Migration Assessment and Analysis

SAP on Azure FREE Migration Assessment & Analysis: Qnovate will provide a one-day assessment of the return on investment for moving your current on-premises SAP applications onto the Azure Cloud.

SQL Server Internals and Troubleshooting - 2-Day Workshop

SQL Server Internals & Troubleshooting: 2 Day Wksp: This level 300 workshop covers SQL Server internals knowledge, practical troubleshooting skills, and a proven approach to problem solving. Attendees will be able to tackle complex SQL Server problems with confidence.

Team Foundation Server -TFS- to VSTS Migration

Team Foundation Server (TFS) to VSTS Migration: Create the best dev environment for your teams. We’ll help you migrate from Team Foundation Server (TFS) to Visual Studio Team Services (VSTS) in three easy steps and five days of embedded support.

Azure.Source – Volume 44


Now in preview

Announcing the public preview of Windows Container Support in Azure App Service - Windows Server Containers in Web Apps are now available in public preview. Azure App Service provides pre-defined application stacks on Windows, such as ASP.NET or Node.js running on IIS. The preconfigured Windows environment locks down the operating system from administrative access, software installations, changes to the global assembly cache, and so on. If your application requires more access than the preconfigured environment allows, you can deploy a custom Windows container instead. Azure App Service caches several parent images to reduce start-up time.

Screenshot showing Windows Containers in Azure App Service pricing tiers

New Azure #CosmosDB JavaScript SDK 2.0 now in public preview - Version 2.0 RC of the JavaScript SDK for SQL API is now available in public preview, which is also open source and available via NPM. For the SQL API, we support a JavaScript SDK to enable development against Azure Cosmos DB from JavaScript and Node.js projects. Version 2.0 of the SDK is written completely in TypeScript, and we’ve redesigned the object model and added support for promises.

Now generally available

Announcing general availability of Azure SQL Database reserved capacity - Azure SQL Database reserved capacity is now generally available for single and elastic pool databases. Customers with active Software Assurance can save up to 55 percent using Azure Hybrid Benefit for SQL Server with the new vCore-based purchasing model in SQL Database. With support for reserved capacity on single databases and elastic pools, you can unlock additional savings by combining your Azure Hybrid Benefit with reserved capacity pricing. SQL Database reserved capacity pricing is scoped to either a single subscription or shared. SQL Database reserved capacity is initially available for single databases and elastic pools, with managed instance support coming later.

Linux on Azure App Service Environment now generally available - Linux on Azure App Service Environment (ASE) combines the features of App Service on Linux and App Service Environment. You can deploy Linux and containerized apps in an App Service Environment, which is ideal for deploying applications into a VNet for secure network access or for apps running at high scale. With Linux on ASE, you can deploy your Linux web applications into an Azure virtual network (VNet) by bringing your own custom container, or just bring your code by using one of the built-in images. Effective July 30, 2018, Linux and containerized apps deployed in an App Service Environment have returned to regular App Service on Linux and App Service Environment pricing. Linux on ASE is now available in all 20+ regions where App Service on Linux is offered.

Also generally available

The Azure Podcast

The Azure Podcast: Episode 241 - Service Fabric & Service Fabric Mesh - Deep Kapur, a Microsoft PM on the Azure team, gives us a great refresher on Service Fabric and breaks down the new Service Fabric Mesh offering for us.

News and updates

Ethereum Proof-of-Authority on Azure - Proof-of-Authority is more suitable for permissioned networks, where all consensus participants are known and reputable. Without the need for mining, Proof-of-Authority is more efficient while still retaining Byzantine fault tolerance. From the Azure portal, you can provision a fully configured blockchain network topology in minutes, using Microsoft Azure compute, networking, and storage services across the globe. Rather than spending hours building out and configuring the infrastructure, we have automated these time-consuming pieces to allow you to focus on building out your scenarios and applications. You are only charged for the underlying infrastructure resources consumed, such as compute, storage, and networking. There are no incremental charges for the solution itself. This solution also comes with Azure Monitor to track node and network statistics.

Security Bulletin for August 2018 - Microsoft is aware of a temporary denial of service (DoS) vulnerability (CVE-2018-5390) affecting the Linux Kernel. Virtual Machines running Linux may be vulnerable. The Azure Host platform remains secure from this vulnerability. We are working with various Linux distributions to ensure that they address this security issue.

Speech Devices SDK and Dev Kits news (August 2018) - Speech Devices SDK v0.5.0 is now available for download. See this post for details on how to access it. Note that the Speech Devices SDK consumes the Speech SDK, so be sure to use the latest version of the Speech Devices SDK and its matching sample app. This SDK only supports Java at this time. The microphone array dev kits from our hardware provider Roobo went on sale recently. This post also has details on how to order, but delivery is currently only available to the USA and China.

Photo of the Roobo Smart Audio Dev Kit

New locations for Azure CDN now available - Azure CDN provides a world-class platform to let you reduce load times, save bandwidth, and speed responsiveness across your business's diverse workflows. Azure CDN from Microsoft enables Azure customers to use and deliver content from the same global CDN network leveraged by Microsoft properties such as Office 365, Skype, Bing, OneDrive, Windows, and Xbox. Eight additional point-of-presence (POP) locations add three new countries to our global coverage.

Azure Data Factory Visual tools now supports GitHub integration - You can now integrate your Azure Data Factory (ADF) with GitHub. The ADF visual authoring integration with GitHub allows you to collaborate with other developers and manage source control and versioning of your data factory assets (pipelines, datasets, linked services, triggers, and more). ADF-GitHub integration works with either public GitHub or GitHub Enterprise, depending on your requirements. You can use OAuth authentication to log in to your GitHub account.

Azure Friday | Azure Data Factory visual tools now integrated with GitHub - Gaurav Malhotra joins Lara Rubbelke to discuss how you can associate a GitHub repository (public & enterprise) with your Azure Data Factory for collaboration, versioning, and source control.

Enhance security and simplify network integration with Extension Host on Azure Stack - We’re bringing the Extension Host solution to Azure Stack so that you only have to open one port (443). This solution is already available on Azure, allowing all requests to be funneled through one port, reducing the ports that need to be opened on the firewall, and allowing customers to communicate with these endpoints via proxy servers. In its first release, the User and Admin portal default extensions have moved to this model, thereby reducing the number of ports from 27 to one. Over time, additional services such as the SQL and MySQL providers will also be changed to use the Extension Host model. Note that Extension Host requires two wildcard SSL certificates: one for the Admin portal and one for the Tenant portal. The Azure Stack Readiness Checker tool will validate these certificates.

Additional news and updates

Azure Friday

Azure Friday | Provisioning Kubernetes clusters on AKS using HashiCorp Terraform - Anubhav Mishra (Developer Advocate, HashiCorp) joins Scott Hanselman to discuss how to use HashiCorp Terraform to create & manage Kubernetes clusters in Azure using Azure Kubernetes Service (AKS). Mishra further explains the benefits of using Terraform to provision Azure infrastructure and demonstrates how to configure a Kubernetes cluster on AKS.

Azure Friday | Siphon on HDInsight Kafka - Thomas Alex joins Lara Rubbelke to discuss how Microsoft uses Apache Kafka for HDInsight to power Siphon, a data ingestion service for internal use. Apache Kafka for HDInsight is an enterprise-grade, open-source, streaming ingestion service. Microsoft created Siphon as a highly available and reliable service to ingest massive amounts of data for processing in near real time. Siphon handles ingestion of over a trillion events per day across multiple business-critical scenarios at Microsoft. In this episode, learn how Siphon uses Apache Kafka for HDInsight as its scalable pub/sub message queue.

Technical content and training

The Developer’s Guide to Microsoft Azure eBook - August update is now available - There's a new and updated Developer’s Guide to Microsoft Azure eBook from the team of Michael Crump (@mbcrump) and Barry Luijbregts (@azurebarry). Extensively updated since the last edition, the new eBook is designed to help you get up to speed with Azure in the shortest time possible and includes practical real-world scenarios. In addition to the PDF, this version is also available in EPUB and Mobi editions, so be sure to click the text link under the cover image for all 3 formats.

Cover image of The Developer's Guide to Microsoft Azure ebook

Azure HDInsight Interactive Query: Ten tools to analyze big data faster - Customers use HDInsight Interactive Query (also called Hive LLAP, or Low Latency Analytical Processing) to query data stored in Azure Storage and Azure Data Lake Storage with very low latency. Interactive Query makes it easy for developers and data scientists to work with big data using the BI tools they love most. HDInsight Interactive Query supports several tools to access big data, which are cataloged in this blog post.

Getting started with IoT: driving business action through analytics and automation - Whether you're dealing with real-time or historical data, or you need to automate a function, Azure IoT has the tools to meet your needs. Learn which Azure IoT offerings can best support your use case whether you need to do near real-time or historical data analysis. Either way, be sure to explore the Building IoT solutions with Azure: A Developer's Guide, which is linked to from this post.

Now Available: Azure Sphere technical documentation - Azure Sphere documentation covers a range of topics from architecture, key concepts, developing applications, and deploying those applications over-the-air to devices in the field. Also included is a QuickStart guide, tutorials and a selection of best practices – all of which will help you rapidly get up and running with your dev kits. Check this post for links to the docs and to order a dev kit today.

How Microsoft drives exabyte analytics on the world’s largest YARN cluster - YARN is known to scale to thousands of nodes, but what happens when you need to scale to tens of thousands of nodes? The Cloud and Information Services Lab (CISL) at Microsoft is a highly specialized team of experts that works on applied research and science initiatives focusing on data processing and distributed systems. This blog post explains how CISL and the Microsoft Big Data team met the challenge of complex scale and resource management, and ended up implementing the world's largest YARN cluster to drive its exabyte-sized analytics.

Learn how to orchestrate serverless functions by scraping APIs in 8 minutes - In this post, Maxime Rouiller explains how he retrieves metadata from over 900 samples in the Azure Samples GitHub repo to validate their last commit date. He refactors a local console application built using the Octokit library into a fan-out/fan-in durable function that orchestrates obtaining the list of samples and then collecting the metadata from each sample in parallel using Azure Functions.
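The fan-out/fan-in shape described above is not specific to Durable Functions. Here is a minimal local sketch of the same pattern using Python's asyncio; the sample names, dates, and function names are hypothetical stand-ins, not the actual Azure Samples data or the Durable Functions API:

```python
import asyncio

async def get_sample_list():
    # Stand-in for the activity that lists the samples (hypothetical names).
    return ["sample-a", "sample-b", "sample-c"]

async def get_last_commit_date(sample):
    # Stand-in for the per-sample metadata activity (hypothetical date).
    await asyncio.sleep(0)  # simulate an I/O-bound call
    return sample, "2018-08-01"

async def orchestrate():
    # Fan-out: start one task per sample; fan-in: gather collects all results.
    samples = await get_sample_list()
    results = await asyncio.gather(*(get_last_commit_date(s) for s in samples))
    return dict(results)

print(asyncio.run(orchestrate()))
```

In the durable-function version, the orchestrator plays the role of `orchestrate` and each activity call replaces the stand-in coroutines, with the framework handling checkpointing and replay.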

Azure tips & tricks


How to use keyboard shortcuts in the Azure portal - In this edition of Azure Tips and Tricks, you'll learn how to use keyboard shortcuts that are available in the Azure portal. Keyboard shortcuts allow you to complete actions and navigate without having to take your hands off of the keyboard, making it easier to work in the portal.

Events

Microsoft Ignite – now with more code - Ignite 2018 is going to have more developer-focused content than ever before. If you're a developer, check out why you should be heading to Orlando in September.

Microsoft Azure at SIGGRAPH 2018 - Microsoft is at SIGGRAPH 2018 in Vancouver, BC, Canada this week. If you're there, come learn more about Azure’s intelligent cloud solutions that produce actionable insights, accelerate media delivery, and inspire your team to deliver new experiences. Join Microsoft in Booth 620 to learn how Microsoft Azure and our partners—including Avid, Nimble Collective and Teradici—can help you transform your media workflows and audience experiences.

IoT In Action webinar series: Agriculture - Join Microsoft’s IoT In Action Agriculture Webinar to learn how we’ve enabled precision farming through FarmBeats and partnerships with solution integrators. We believe that collected and predicted data, coupled with the farmer’s knowledge about the farm, can help increase farm productivity and help reduce costs. You won’t want to miss learning how Microsoft’s FarmBeats project is building several unique solutions and using machine learning to solve problems via a combination of data from low-cost sensors, drones, and computer vision.


The IoT Show | IoT In Action - The Next Agricultural Revolution - Collected and predicted data, coupled with the farmer's knowledge about the farm, can help increase farm productivity and reduce costs. Microsoft has already enabled precision farming through FarmBeats and partnerships with solution integrators leveraging the Azure IoT platform and solutions.

Service Fabric Community Q&A 27th Edition - The Service Fabric team will be holding their 27th monthly community Q&A call on Thursday, August 16 at 4pm Pacific Time. Their teammates in Denmark and India will host a Q&A earlier that day at 1130 UTC to accommodate other time zones.

Twitter AMA with the App Service Team #AppServiceAMA!! - The Azure App Service team will host a special Ask Me Anything (AMA) session on Twitter, Wednesday, August 22, 2018 from 9am to 11am Pacific Time. You can tweet to @AzureSupport with #AppServiceAMA with your questions about the service.

The IoT Show


The IoT Show | IoT Developers and the MVP Program - What is the Most Valuable Professional (MVP) program about and how can you, as an IoT developer, be nominated or nominate someone awesome to become an MVP? Ranga Vadlamudi tells us all about the IoT MVP program on the IoT Show.

Customers and partners

Enhance your DevSecOps practices with Azure Security Center’s newest playbooks - Cloud-hosted workloads offer excellent scalability, ease of deployment, and pre-secured infrastructure for your workloads. However, the workloads themselves may still be susceptible to attack by cybercriminals. Avyan Consulting partnered with the Azure Security Center team to build attack simulation playbooks for demonstration and training purposes. Azure administrators may use these playbooks to deploy fully operational web and compute workloads, security management tools such as Azure Security Center and Web Application Firewalls (WAFs), and SQL threat protection. These tools work with Cloudneeti from Avyan Consulting to provide rich compliance views across Azure resources.

Industries

Accelerate healthcare initiatives with Azure UK NHS blueprints - To rapidly acquire new capabilities and implement new solutions, healthcare IT and developers can now take advantage of industry-specific Azure Blueprints. These are packages that include reference architectures, guidance, how-to guides and other documentation, and may also include executable code and sample test data built around a key use case of interest to healthcare organizations. Blueprints also contain components to support privacy, security, and compliance initiatives, including threat models, security controls, responsibility matrices, and compliance audit reports. This post focuses on four new blueprints created to support UK National Health Service (NHS) cloud initiatives, as well as an existing blueprint for AI in healthcare that includes HIPAA and HITRUST compliance support, which can be used worldwide with suitable customization.

A Cloud Guru's This Week in Azure


Azure This Week - 10 August 2018 - This time on Azure This Week, Lars goes through four new services for the Azure platform: manual failover for IoT Hub, instance size flexibility for Azure Reserved Virtual Machine Instances, the general availability of Azure management groups, and the general availability of Linux on App Service Environment.

Visual Studio Code July 2018

Advisory serializing/deserializing a CultureAwareComparer with .NET Framework 4.6+


An issue with serializing/deserializing a CultureAwareComparer was discovered with .NET Framework 4.7.1. This issue was resolved earlier this year, with the May 2018 Preview of Quality Rollup. We recently learned that some applications continue to experience this issue. The changes made to .NET Framework 4.7.1 were also integrated into .NET Framework 4.6 and later servicing releases. As a result, this advisory applies to .NET Framework 4.6 and later versions.

This advisory is provided so that developers can correctly diagnose this issue and be aware of our guidance. We will continue to update this issue and dotnet/announcements #81 as we have new information to share.

Guidance

This guidance only applies to customers that have experienced this issue.

We intend to release an update for .NET Framework 4.6 and later releases that resolves this issue within the next month. When that happens, you must deploy the .NET Framework update for this issue to all .NET Framework 4.6+ machines that communicate via serialized objects. After doing that, cross-machine serialization will work as expected.

Please consult your software vendor to understand if this impacts the software you are using.

Technical Context

In the May 2018 Preview of Quality Rollup, we made a change to the internal type CultureAwareComparer to implement ISerializable. We made the change to address an issue with serializing and deserializing a CultureAwareComparer.

The reported issue was that a type containing an instance of CultureAwareComparer could not be correctly serialized and deserialized across different versions of the .NET Framework. A few types in the .NET Framework do this; the most commonly affected is the Dictionary<TKey, TValue> type.

The change to implement the ISerializable interface had unintended side effects when exchanging this type across different machines. We have had a small number of reports from customers who have been affected.

A common, but not unique, way of exchanging objects across machines is to use WCF in conjunction with a serializer, one of the most common being DataContractSerializer. The side effect of the change prevented data serialized on systems where the type implements ISerializable from correctly deserializing on systems where the type does NOT implement ISerializable. The opposite is also true: systems where the type does not implement ISerializable will not correctly deserialize data from systems where the type does implement ISerializable.

We have an updated fix that we intend to release within the next month that will remove the implementation of the ISerializable interface on the type and use a different internal mechanism for ensuring that cross-version serialization continues to work as expected. This change addresses the break experienced with WCF.

Symptoms

An application that is affected by this issue will most likely crash when trying to deserialize an object. It may print or log an error message similar to the following one:

The formatter threw an exception while trying to deserialize the message: 
There was an error while trying to deserialize parameter . The InnerException message was ''EndElement' 'Comparer' from namespace '' is not expected. Expecting element '_compareInfo'.'.  Please see InnerException for more details.

Azure Site Recovery powers Veritas Backup Exec Instant Cloud Recovery for DR


Microsoft Azure Site Recovery (ASR) offers Disaster Recovery as a Service (DRaaS) for applications running in Azure and on-premises. Using ASR, you can reduce application downtime during IT interruptions, without compromising compliance. ASR provides comprehensive coverage for on-premises applications across Linux, Windows, VMware and Hyper-V virtual machines, and physical servers.

Azure Site Recovery now powers Veritas Backup Exec Instant Cloud Recovery (ICR) with the Backup Exec 20.2 release. With ICR, Backup Exec users can easily configure continuous replication of on-premises virtual machines to Azure for a quick failover in case of a disruption on-premises. Backup Exec customers can minimize Recovery Point Objective (RPO) and Recovery Time Objective (RTO) for business-critical applications, realizing the benefits of Azure as a dynamic disaster recovery platform.

Instant Cloud Recovery requires an Azure subscription and supports VMware and Hyper-V virtual machines. ICR provides a summary view, monitoring, replication health details, and key actions directly from the familiar Backup Exec console. You can go to the Azure portal for advanced monitoring and failover operations.

Screenshot of the Backup Exec console

Backup Exec Instant Cloud Recovery to Azure is available in all Azure regions where ASR is available. Get started by downloading the Backup Exec 20.2 trial version.

Related links and additional content

C++ development with Docker containers in Visual Studio Code


Containers allow developers to package up an application with all the parts it needs, such as libraries and other dependencies, and ship it all out as one image. This is especially useful for C++ cross-platform development: with containers you can target a platform that runs on a completely different operating system than your developer machine. And even if both run the same OS, container technology ensures that the application will run on any other machine, regardless of customized settings that might differ from the machine used for writing and testing the code.

Docker is a very popular container platform that makes it easy to create, deploy, and run applications by using containers, and whether you are a seasoned Docker developer or just getting started, Visual Studio Code has great support for working with Docker containers inside the editor.

In this blog post, we are going to walk through how to create a Docker image for C++, start Docker containers, and build and run a C++ “HelloWorld” app in the container using Visual Studio Code.

Install tools

First, let’s get the tools you’ll need for this walkthrough:

  1. Install Docker on your machine: for Mac, for Windows, or for Linux.
  2. Install Visual Studio Code: https://code.visualstudio.com/.
  3. Docker support for VS Code is provided by an extension. To install the Docker extension, open the Extensions view in VS Code and search for docker to filter the results. Select the Microsoft Docker extension, click on Install, and reload VS Code when completed.

Get our C++ “HelloWorld” app ready

Let’s first work on a C++ “HelloWorld” app, which we’re going to build and run in a Docker container later in this walkthrough. First, create a new folder on your disk and open it in VS Code. Then add a new file named Test.cpp in the folder. Your VS Code window should look like this:

Now let’s put the following content into the Test.cpp file. Save the file and we are done here.

#include <iostream>

int main(int argc, char const *argv[])
{
  std::cout << "Hello Docker container!" << std::endl;
  return 0;
}

Build a Docker image for C++ development

With Docker, you can build images by specifying the step by step commands needed to build the image in a Dockerfile. A Dockerfile is just a text file that contains the build instructions.

1. Create a Dockerfile. The VS Code Docker extension provides a command for adding a Dockerfile to your workspace. With the folder that we created in the previous section opened in VS Code, open the Command Palette (F1) and select the command Docker: Add Docker files to Workspace to generate Dockerfile, docker-compose.yml, docker-compose.debug.yml, and .dockerignore files for your workspace type.

You will be prompted with a list of platforms or languages you could target. In this walkthrough, let’s choose Other from the list, which will give us a generic Dockerfile from which we can build our C++ specific image.

2. Add build instructions to the Dockerfile. Now we are going to edit the generated Dockerfile with instructions for how the Docker image should be built. The Docker extension enables a great Dockerfile editing experience by providing auto-completion suggestions, snippets, QuickInfo, and linting for Docker commands.

In the following sections, we will describe how to build a C++ development environment with GCC and Clang as the compiler. Depending on which compiler you choose, you only need to follow one of the two sections (2.1 or 2.2).

2.1 Build a Docker image for a Clang environment

For building a C++ development environment with the Clang compiler, let’s copy the following content into the Dockerfile, which installs Clang on top of a base Ubuntu image.

# Get the base Ubuntu image from Docker Hub
FROM ubuntu:latest

# Update the package index on the base image
RUN apt-get -y update

# Install the Clang compiler
RUN apt-get -y install clang

# Copy the current folder which contains C++ source code to the Docker image under /usr/src
COPY . /usr/src/dockertest1

# Specify the working directory
WORKDIR /usr/src/dockertest1

# Use Clang to compile the Test.cpp source file
RUN clang++ -o Test Test.cpp

# Run the output program from the previous step
CMD ["./Test"]

2.2 Build a Docker image for a GCC environment

You can follow similar steps as above to build an image that installs GCC instead of Clang. Or, you could use a base image that has GCC pre-installed to simplify the steps. Let’s copy the following into the Dockerfile to use an image pre-installed with GCC:

# Get the GCC preinstalled image from Docker Hub
FROM gcc:4.9

# Copy the current folder which contains C++ source code to the Docker image under /usr/src
COPY . /usr/src/dockertest1

# Specify the working directory
WORKDIR /usr/src/dockertest1

# Use GCC to compile the Test.cpp source file
RUN g++ -o Test Test.cpp

# Run the program output from the previous step
CMD ["./Test"]

3. Build the Docker image. Now that we have the build instructions in the Dockerfile, we can go ahead and build the Docker image. Docker: Build Image is one of the Docker commands the Docker extension provides in the Command Palette.

Another way to invoke the Docker Build command is to right click on the Dockerfile itself in the VS Code Explorer window and select Build Image.

You will then get a prompt asking for the name and version to tag the image.

While the build is in progress, you can see in the VS Code Terminal window how each step specified in the Dockerfile is being executed by Docker. See the following screenshot as an example.

Once completed, you can find the built image under the Images node in the Docker tab in VS Code. The following screenshot shows the image we just built appearing at the top of the image list.

One thing to note is that our base image “ubuntu:latest” is also in the list. This means it can be reused as the base for other images built on top of it, with only the image difference stored, which is a huge disk-space saving compared with spinning up multiple full VMs.

Run Docker containers

Now that we have an image, let’s try to run it. There are two ways to run an image using the Docker extension. Run executes the image and runs the default CMD specified above. In this case we will use Run Interactive, which executes the CMD but also gives us an interactive terminal we can use to see what is going on inside the running container.

In the Docker Explorer window, we can see that the container we just started is running:

In the Terminal window, we can see that the default CMD we specified in the Dockerfile, which is to run our Test app (CMD ["./Test"]), has been executed by Docker and we got the output just as expected.

Wrap-up

In this blog post, we walked step by step through how to use the VS Code Docker extension to build Docker images, start Docker containers, and build and run C++ programs in a container. In a future post we’ll cover how to attach to a process in the container and debug it. If you have any suggestions for future topics on using C++ with containers, please let us know by leaving comments below.

Also, if you are using containers or doing cloud development with C++, we would love to hear from you. If you could take a few minutes to complete our C++ cloud and container development survey, it will help us focus on topics that are important to you, both on the blog and in the form of product improvements.


Python in Visual Studio 2017 version 15.8


We have released the 15.8 update to Visual Studio 2017. You will see a notification in Visual Studio within the next few days, or you can download the new installer from visualstudio.com.

In this post, we're going to look at some of the new features we have added for Python developers: IntelliSense with typeshed definitions, faster debugging, and support for Python 3.7. For a list of all changes in this release, check out the Visual Studio release notes.

Faster debugging, on by default

We first released a preview of our ptvsd 4.0 debug engine in the 15.7 release of Visual Studio; in the 15.8 release it is now the default, offering faster and more reliable debugging for all users.

If you encounter issues with the new debug engine, you can revert to the previous debug engine by selecting Use legacy debugger from Tools > Options > Python > Debugging.

Richer IntelliSense

We are continuing to make improvements to IntelliSense for Python in Visual Studio 2017. In this release you will notice completions that are faster, more reliable, and have better understanding of the surrounding code, and tooltips with more focused and useful information. Go To Definition and Find All References are better at taking you to the module a value was imported from, and Python packages that include type annotations will provide richer completions. These changes were made as part of our ongoing effort to make our Python analysis from Visual Studio available as an independent Microsoft Python Language Server.

As an example, the screenshot below shows improved tooltips with richer information when hovering over the os module in 15.8 compared to 15.7:

We have also added initial support for using typeshed definitions to provide more completions in places where our static analysis is unable to infer complete information. We are still working through some known issues, so results may be limited; expect better support for typeshed in future releases.
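Type annotations of the kind the analysis consumes look like this; a minimal illustration with hypothetical names, not tied to any particular typeshed stub:

```python
from typing import List, Optional

def find_sample(names: List[str], target: str) -> Optional[str]:
    """Return the first name equal to target, or None if absent."""
    for name in names:
        if name == target:
            return name
    return None

# Because the return type is annotated, an analyzer can offer str
# completions (e.g. .upper, .startswith) on `match` without running the code.
match = find_sample(["ada", "grace"], "grace")
```

Typeshed supplies the same kind of annotations for the standard library and popular packages that don't annotate their own source.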

Support for Python 3.7

We have updated Visual Studio so that all of our features work with Python 3.7, which was recently released. Most functionality of Visual Studio worked with Python 3.7 as of the 15.7 release, and in 15.8 we made specific fixes so that debug attach, profiling, and mixed-mode (cross-language) debugging also work with Python 3.7.
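If you want a quick check that your project is running against a 3.7 interpreter, the dataclasses module, new in Python 3.7, makes an easy smoke test (a generic sketch, not taken from the release notes):

```python
from dataclasses import dataclass, field

@dataclass
class Sample:
    name: str
    stars: int = 0
    tags: list = field(default_factory=list)  # safe mutable default

# dataclass generates __init__ and __repr__ for us; importing the module
# at all fails on Python 3.6 and earlier.
s = Sample("hello-world", stars=3)
print(s)  # Sample(name='hello-world', stars=3, tags=[])
```

On an older interpreter, the import itself raises ModuleNotFoundError, which makes the version mismatch obvious immediately.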

Give Feedback

Be sure to download the latest version of Visual Studio and try out the above improvements. If you encounter any issues, please use the Report a Problem tool to let us know (this can be found under Help, Send Feedback) or continue to use our GitHub page. Follow our Python blog to make sure you hear about our updates first, and thank you for using Visual Studio!

August 2018 Security and Quality Rollup


Today, we are releasing the August 2018 Security and Quality Rollup.

Security

CVE-2018-8360 – Windows Information Disclosure Vulnerability

This update resolves an information disclosure vulnerability in Microsoft .NET Framework that could allow an attacker to access information in multi-tenant environments. The vulnerability is caused when .NET Framework is used in high-load/high-density network connections in which content from one stream can blend into another stream.

To exploit the vulnerability, an attacker who can access one tenant in a high-load/high-density environment could potentially trigger multi-tenanted data exposure from one customer to another.

This security update addresses the vulnerability by correcting the way that .NET Framework handles high-load/high-density network connections.

CVE-2018-8360

Quality and Reliability

This release contains the following quality and reliability improvements.

CLR

  • Applications that rely on COM components were failing to load or run correctly because of “access denied”, “class not registered”, or “internal failure occurred for unknown reasons” errors, as described in KB 4345913 and the blog advisory.  [651528]

Note: Additional information on these improvements is not available. The VSTS bug number provided with each improvement is a unique ID that you can give to Microsoft Customer Support, include in StackOverflow comments, or use in web searches.

Getting the Update

The Security and Quality Rollup is available via Windows Update, Windows Server Update Services, Microsoft Update Catalog, and Docker.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework updates are part of the Windows 10 Monthly Rollup.

The following table is for Windows 10 and Windows Server 2016+.

Product Version: Security and Quality Rollup KB

Windows 10 1803 (April 2018 Update): Catalog 4343909
  .NET Framework 3.5: 4343909
  .NET Framework 4.7.2: 4343909
Windows 10 1709 (Fall Creators Update): Catalog 4343897
  .NET Framework 3.5: 4343897
  .NET Framework 4.7.1: 4343897
Windows 10 1703 (Creators Update): Catalog 4343885
  .NET Framework 3.5: 4343885
  .NET Framework 4.7, 4.7.1: 4343885
Windows 10 1607 (Anniversary Update) and Windows Server 2016: Catalog 4343887
  .NET Framework 3.5: 4343887
  .NET Framework 4.6.2, 4.7, 4.7.1: 4343887
Windows 10 1507: Catalog 4343892
  .NET Framework 3.5: 4343892
  .NET Framework 4.6, 4.6.1, 4.6.2: 4343892

The following table is for earlier Windows and Windows Server versions.

Product Version: Security and Quality Rollup KB / Security Only Update KB

Windows 8.1, Windows RT 8.1, and Windows Server 2012 R2: Catalog 4345592 / Catalog 4345681
  .NET Framework 3.5: 4344153 / 4344178
  .NET Framework 4.5.2: 4344147 / 4344171
  .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2: 4344145 / 4344166
Windows Server 2012: Catalog 4345591 / Catalog 4345680
  .NET Framework 3.5: 4344150 / 4344175
  .NET Framework 4.5.2: 4344148 / 4344172
  .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2: 4344144 / 4344165
Windows 7 and Windows Server 2008 R2: Catalog 4345590 / Catalog 4345679
  .NET Framework 3.5.1: 4344152 / 4344177
  .NET Framework 4.5.2: 4344149 / 4344173
  .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2: 4344146 / 4344167
Windows Server 2008: Catalog 4345593 / Catalog 4345682
  .NET Framework 2.0, 3.0: 4344151 / 4344176
  .NET Framework 4.5.2: 4344149 / 4344173
  .NET Framework 4.6: 4344146 / 4344167

Docker Images

We are updating the following .NET Framework Docker images for today’s release:

Note: Look at the “Tags” view in each repository to see the updated Docker image tags.

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:

Microsoft R Open 3.5.1 now available


Microsoft R Open 3.5.1 has been released, combining the latest R language engine with multi-processor performance and tools for managing R packages reproducibly. You can download Microsoft R Open 3.5.1 for Windows, Mac and Linux from MRAN now. Microsoft R Open is 100% compatible with all R scripts and packages, and works with all your favorite R interfaces and development environments.

This update brings a number of minor fixes to the R language engine from the R core team. It also makes available a host of new R packages contributed by the community, including packages for downloading financial data, connecting with analytics systems, applying machine learning algorithms and statistical models, and many more. New R packages are released every day, and you can access packages released after the 1 August 2018 CRAN snapshot used by MRO 3.5.1 using the checkpoint package.

We hope you find Microsoft R Open useful, and if you have any comments or questions please visit the Microsoft R Open forum. You can follow the development of Microsoft R Open at the MRO Github repository. To download Microsoft R Open, simply follow the link below.

MRAN: Download Microsoft R Open

Visual Studio 2017 version 15.8


Today we are releasing Visual Studio 2017 version 15.8. In this version we have focused on productivity, performance, and bug fixes. There are many new features you’ll find useful, but in this post I’ll highlight the ones you may be most interested in. For the complete list of all the updates in today’s release, check out the Visual Studio 2017 version 15.8 release notes and the list of bugs submitted by you that are fixed. If you prefer to first try these updates without installing the release, check out the Visual Studio images in Azure.

A couple of notable additions to try:

  • Multi-caret editing improvements
  • Faster git branch switching
  • Faster unit test execution
  • TypeScript 3.0 support

Read on for more details.

Productivity

This release adds notable productivity and debugging enhancers.

Multi-caret editing. Editing multiple locations in a file simultaneously is now easy. Start by creating insertion points and selections at multiple locations in a file with multi-caret support. You can then add, edit, or delete text in multiple places at once.

  • Insert carets with Ctrl + Alt + LeftMouseClick
  • Add a selection and caret at the next location that matches the current selection with Shift + Alt + Ins.
  • See Edit > Multiple Carets for the full list of actions.

Contextual Navigation. You can now access a contextual navigation menu with the shortcut Alt + `.

New keybinding profiles for Visual Studio Code and ReSharper (Visual Studio). Speaking of shortcuts, you can now keep your keybindings consistent with two new keyboard profiles: Visual Studio Code and ReSharper (Visual Studio). You can find these schemes under Tools > Options > General > Keyboard and the top drop-down menu.

Keyboard mappings

New commands and improvements to the Go To All window:

  • Go to Enclosing Block (Ctrl + Alt + UpArrow) allows you to quickly navigate up to the beginning of the enclosing code block.
  • Go to Next/Previous Issue (Alt + PgUp/PgDn) allows you to skip to the next/previous issue (error, squiggle, lightbulb).
  • Go to Member (Ctrl + T, M) is now scoped to the file by default. You can change the default back to solution by toggling the Scope to Current Document (Ctrl + Alt + C).

And… more refactorings and quick actions using Ctrl + . or Alt + Enter:

  • Invert If enables you to invert your logic in if-else statements. Place your cursor in the if keyword to trigger this refactoring.
  • Add parameter from method callsite allows you to add a parameter to a method by adding an argument to a method callsite and triggering Quick Actions and Refactorings.
  • Remove unnecessary parentheses removes parentheses around binary operators that are not essential for compilation. You can configure this style rule through Tools > Options > Text Editor > C# > Code Style > General or .editorconfig:
    • dotnet_style_parentheses_in_arithmetic_binary_operators
    • dotnet_style_parentheses_in_relational_binary_operators
    • dotnet_style_parentheses_in_other_binary_operators
    • dotnet_style_parentheses_in_other_operators
  • Use ternary conditionals in assignments and return statements can also be configured as a style rule in Tools > Options > or through .editorconfig:
    • dotnet_style_prefer_conditional_expression_over_assignment
    • dotnet_style_prefer_conditional_expression_over_return
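The rules above can be configured together in a single .editorconfig file. The fragment below is a sketch: the `always_for_clarity` / `never_if_unnecessary` values and the `:suggestion` severity suffix reflect the documented option format, but the particular choices shown are illustrative, not recommendations:

```ini
root = true

[*.cs]
# Parentheses around binary operators
dotnet_style_parentheses_in_arithmetic_binary_operators = never_if_unnecessary:suggestion
dotnet_style_parentheses_in_relational_binary_operators = always_for_clarity:suggestion
dotnet_style_parentheses_in_other_binary_operators = always_for_clarity:suggestion
dotnet_style_parentheses_in_other_operators = never_if_unnecessary:suggestion

# Prefer ternary conditionals in assignments and return statements
dotnet_style_prefer_conditional_expression_over_assignment = true:suggestion
dotnet_style_prefer_conditional_expression_over_return = true:suggestion
```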

Select develop vs. debug instance of Visual Studio. When you have more than one instance of Visual Studio 2017 installed, you can now select which instance to deploy your extension to when debugging. That way you can, for example, develop in the Visual Studio release channel while debugging in the preview channel.

Performance

In this release we have continued to focus on performance and have made significant improvements in many areas.

Git branch checkout and branch switching. Git branch checkout and branch switching for C#, VB, and C++ projects is much faster for large solutions since solution reload is no longer required.

Option to not reopen documents from the previous session. We received feedback that reopening documents from the previous session was sometimes unnecessary and caused performance delays, so in this release we added an option to disable it. You can toggle this option in Tools > Options > Projects and Solutions > General.

Test performance. We significantly improved performance when running a few tests in a large solution with multiple test projects. In our lab, a solution with over 10,000 MSTests executed a single test up to 82% faster!

CPU Usage Tools performance improvements. We have a few notable improvements to highlight regarding the CPU Usage Tool. The CPU Usage tool in the Performance Profiler (ALT-F2) can now start in a paused state, which means that it will not collect any CPU usage sample stack data until it is specifically enabled. This makes the resultant amount of data much smaller to collect and analyze, in turn making your performance investigations more efficient. Once you start the target application, a monitoring display will show the CPU utilization graph and will allow you to control the CPU profiling and enable/disable sample data collection as many times as you like.

Pause / Resume Collection of CPU Usage Data

.NET Object Allocation Tracking Tool. The .NET Object Allocation Tracking Tool joins the family of tools available from the Performance Profiler (ALT-F2). Invoking this tool for a performance profiler session causes the collection of a stack trace for every .NET object allocation that occurs in the target application. This stack data is analyzed along with object type and size information to reveal details of the memory activity of your application. You can quickly determine the allocation patterns in your code and identify anomalies as well. In addition, for Garbage Collection events, you can easily determine which objects were collected and which were retained, quickly determining object types which dominate the memory usage of your application. This is especially useful for API writers to help minimize allocations. While your test application is executing, the Performance Profiler displays a monitoring view with a line graph of Live Objects (count), as well as an Object Delta (% change) bar graph.

The .NET Object Allocation Tracking Tool

C++ Development

C++11 standards conformance. We added a new, experimental, token-based preprocessor that conforms to the C++11 standard (including C99 preprocessor features), enabled with the /experimental:preprocessor switch. It is controlled with the macro _MSVC_TRADITIONAL, which is defined to 1 when using the traditional preprocessor and 0 when using the new experimental, standards-conformant preprocessor.

CMake. Adding configurations to CMakeSettings.json is now as simple as selecting a template.

C++ Just My Code. C++ Just My Code debugging now enables you to step over code from system or third-party C++ libraries, in addition to collapsing those calls in the call-stack window. You control this behavior for any C++ library by compiling your code with /JMC (the default for Debug configurations) and specifying the non-user library paths in a .natjmc file. If a system library calls into user code, when you step in, the debugger will skip all system code and stop on the first line of the user-code callback.
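A .natjmc file is XML listing module, file, or function patterns to treat as non-user code. The sketch below follows the documented element names, but the specific module, path, and namespace patterns are placeholder examples, not defaults:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Sketch of a .natjmc file; the patterns below are hypothetical examples -->
<NonUserCode xmlns="http://schemas.microsoft.com/vstudio/debugger/jmc/2015">
  <!-- Treat an entire module as non-user code -->
  <Module Name="ThirdPartyMath.dll" />
  <!-- Or mark code by source file path (wildcards allowed) -->
  <File Name="C:\external\vendor\*" />
  <!-- Or mark code by function name pattern -->
  <Function Name="VendorNs::*" />
</NonUserCode>
```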

Code analysis experience. Code analysis can now run in the background when files are opened or saved, and results will be displayed in the error list and as green squiggles in the editor. You can enable the new, in-progress features under Tools > Options > Text Editor > C++ > Experimental > Code Analysis.

Code analysis results displayed in the error list and as green squiggles in the editor

F# 4.5 and F# Tools for Visual Studio

F# language version 4.5. In this release we are introducing the F# language version 4.5. This also corresponds with the new 4.5.x family of FSharp.Core (the F# core library). With this come many improvements to the F# compiler. You can read the specs for each of these changes in the F# RFC repository.

F# Tools for Visual Studio. Notable improvements include IntelliSense performance, transactional brace completion, an experimental CodeLens implementation, and many bug fixes contributed by the community. A community-driven effort to analyze and improve IntelliSense performance for very large files was contributed by Vasily Kirichenko, Steffen Forkmann, and Gauthier Segay; IntelliSense in very large files (10k+ lines of code) is roughly twice as fast now. Automatic, transactional brace completion is now available for the (), [], {}, [||], and [<>] brace pairs; we did this work in collaboration with Gibran Rosa. There is also an experimental CodeLens implementation, contributed by Victor Peter Rouven Müller, which you can turn on in Options > Text Editor > F# > Code Lens. Check out the release notes to see the many other bug fixes and improvements to the F# tools in this release.

JavaScript and TypeScript Tooling

TypeScript 3.0. This version of Visual Studio includes TypeScript 3.0 by default.

Vue.js support. Support for the Vue.js library has been improved, in particular support for .vue files, also known as “single-file components”. If the Node.js workload is installed, there will now be “Basic Vue.js Web Application” templates under the “JavaScript / Node.js” and “TypeScript / Node.js” paths in the New Project dialog. The image below shows an example of editing TypeScript code inside a script block in a .vue file.

ESLint support. ESLint support has been reimplemented in this release, so Visual Studio will now lint JavaScript files as you edit. ESLint has been updated to use ESLint 4 by default, but if your project has a local installation of ESLint, that version will be used instead. You can easily disable ESLint globally in Visual Studio by unchecking the “Enable ESLint” setting in the “Tools / Options” dialog in the location shown below.

Check out the TypeScript 3.0 release announcement for all the details.

Visual Studio Web Tools

Library Manager. Library Manager is a new feature included in Visual Studio 2017. It helps you manage client-side libraries in your web projects.

Single project Docker Container. We added a new single project Docker container experience for ASP.NET Core web projects. This supplements the existing Docker Compose-based container tooling and provides a simpler, easier way to create, debug, and build Docker containers right from Visual Studio.

Mobile Development for Android

Support for Google Android Emulator. This release adds support for the Google Android emulator that is compatible with Hyper-V when running on the Windows 10 April 2018 Update. This enables you to use Google’s Android emulator side-by-side with other Hyper-V based technologies, including Hyper-V virtual machines, Docker tooling, the HoloLens emulator, and more. Mobile app developers who use Hyper-V now have access to a fast Android emulator that always supports the latest Android APIs, works with Google Play Services out of the box, and supports all features of the Android emulator, including camera, geolocation, and Quick Boot.


Xamarin.Android Designer. We made significant improvements to the designer experience for Xamarin.Android. The highlight is a new split-view editor, which allows you to create, edit, and preview your layouts at the same time.


.NET and ASP.NET

.NET Core SDK 2.1.400. Visual Studio 15.8 includes the .NET Core SDK 2.1.400. New SDK features include NUnit templates, support for signed global tools, and improved help text for better clarity.

ASP.NET .NET Framework secrets support. For ASP.NET projects that target .NET Framework 4.7.1 or higher, you can now keep secrets you do not want in your source code in usersecrets.xml by right-clicking the project and selecting “Manage User Secrets”.

.NET Framework 4.7.2. Visual Studio 2017 version 15.8 now offers the .NET Framework 4.7.2 development tools to all supported platforms with the 4.7.2 runtime included. The .NET Framework 4.7.2 offers several new features and improvements as well as numerous reliability, stability, security, and performance fixes. You can find more details about the .NET Framework 4.7.2 in these articles:

Share Your Feedback

As always, we want to know what you think. Please install Visual Studio 2017 version 15.8 and share your thoughts and concerns.

Please let us know of any issues you have via the Report a Problem tool in Visual Studio. You can track your issues in the Visual Studio Developer Community, where you can ask questions and find answers. You can also engage with us and other Visual Studio developers through our new Gitter community (requires a GitHub account), make a product suggestion through UserVoice, or get free installation help through Live Chat Support.

Thanks,

John

John Montgomery, Director of Program Management for Visual Studio
@JohnMont

John is responsible for product design and customer success for all of Visual Studio, C++, C#, VB, JavaScript, and .NET. John has been at Microsoft for 17 years, working in developer technologies the whole time.

#Contest Microsoft AI Idea Challenge – AI’s next breakthrough is you!



All of us have creative ideas – ideas that can improve our lives and the lives of thousands, perhaps even millions, of others. But how often do we act on turning those ideas into a reality? Most of the time, we do not believe in our ideas strongly enough to pursue them. Other times we feel we lack a platform to build out our idea or showcase it. Most good ideas never go beyond those initial creative thoughts in our heads.

If you’re a professional working in the field of artificial intelligence (AI), or an aspiring AI developer or just someone who is passionate about AI and machine learning, Microsoft is excited to offer you an opportunity to transform your most creative ideas into reality. Join the Microsoft AI Idea Challenge Contest today for a chance to win exciting prizes and get your project featured in Microsoft’s AI.lab showcase. Check out the rules, terms and conditions of the contest and then dive right in!

The challenge

The Microsoft AI Idea Challenge is seeking breakthrough AI solutions from developers, data scientists, professionals, and students, preferably developed on the Microsoft AI platform and services. The challenge gives you a platform to freely share AI models and applications so they are reusable and easily accessible. The ideas you submit are judged on the parameters shown in the figure below: essentially, half the weight is for the originality of your idea, 20 percent for the feasibility of your solution, and 30 percent for the complexity (i.e., level of sophistication) of your implementation.

 

The Microsoft AI Challenge is accepting submissions between now and October 12, 2018.

To qualify for the competition, individuals or teams are required to submit a working AI model, test dataset, a demo app and a demo video that can be a maximum of three minutes long. We encourage you to register early and upload your projects soon, so that you can begin to plan and build out your solution and turn in the rest of your materials on time. We are looking for solutions across the whole spectrum of use cases – to be inspired, take a look at some of the examples at AI.lab.

Prizes

The winners of the first three places in the contest will respectively receive a Surface Book 2, a DJI Drone, and an Xbox One X.


We hope that’s motivation to get you started today – good luck!

Tara



