
The week in .NET – On .NET on Docker and new Core tooling, Benchmark.NET, Magicka



On .NET

In this week’s episode, we’re running ASP.NET in a Docker image, and we look at some of the changes in the .NET Core .csproj tooling. Apologies to those of you who watched live: we had some technical difficulties, and as a consequence, we did a second recording, which is now on Channel 9.

This week, we’ll have Phil Haack on the show. Phil works for GitHub, and before that was the Program Manager for ASP.NET MVC. We’ll stream live on Channel 9. We’ll take questions on Gitter’s dotnet/home channel and on Twitter. Please use the #onnet tag. It’s OK to start sending us questions in advance if you can’t do it live during the shows.

We’ll also record a couple of additional surprise interviews in preparation for the celebration of the 15th anniversary of .NET, and the 20th anniversary of Visual Studio. Stay tuned!

Package of the week: Benchmark.NET

When done properly, benchmarking is a great way to guide your engineering choices by comparing multiple solutions to a problem known to cause performance bottlenecks in your applications. There’s a lot of methodology involved if you want to do it right, however, and it is both tricky and repetitive. And no, surrounding your code with a Stopwatch won’t cut it.

Benchmark.NET makes it very easy to decorate the code that you want to test so it can be discovered, run many times, and measured. Benchmark.NET takes care of warmup and cooldown periods as needed, and will compute mean running times and standard deviation for you. It can also generate reports in a variety of formats.
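For illustration, here’s a minimal sketch of what a Benchmark.NET benchmark class looks like (the package and namespaces are named BenchmarkDotNet); the class name, method bodies, and scenario are made up for this example, but the [Benchmark] attribute and the BenchmarkRunner entry point are the library’s core pattern.

using System.Text;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class StringBuildingBenchmarks
{
    // Each [Benchmark] method is discovered, warmed up, and run many times by the library.
    [Benchmark]
    public string WithConcatenation()
    {
        var s = "";
        for (var i = 0; i < 100; i++) s += i;
        return s;
    }

    [Benchmark]
    public string WithStringBuilder()
    {
        var sb = new StringBuilder();
        for (var i = 0; i < 100; i++) sb.Append(i);
        return sb.ToString();
    }
}

public class Program
{
    // Runs both benchmarks and prints a report with mean times and standard deviation.
    public static void Main() => BenchmarkRunner.Run<StringBuildingBenchmarks>();
}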

Game of the week: Magicka

Magicka is an action-adventure game set in a world based on Norse mythology. Take on the role of a wizard from a sacred order while you embark on a quest to stop an evil sorcerer who has thrown the world into turmoil. Magicka features a dynamic spell casting system that has you combining the elements to cast spells. You can also play with up to three of your friends in co-op and versus modes.

Magicka

Magicka was created by Arrowhead Game Studios using C# and XNA. It is available for Windows on Steam.

User group meeting of the week: C# 7 with Jon Skeet in Adelaide

If you’re around Adelaide on Wednesday, February 8, don’t miss the Adelaide .NET User Group’s meetup with Jon Skeet on C# 7. Jon Skeet is none other than the #1 member on Stack Overflow, and an absolute authority on C#.

.NET

ASP.NET

F#

Check out F# Weekly for more great content from the F# community.

Xamarin

Azure

UWP

Games

And this is it for this week!

Contribute to the week in .NET

As always, this weekly post couldn’t exist without community contributions, and I’d like to thank all those who sent links and tips. The F# section is provided by Phillip Carter, the gaming section by Stacey Haffner, the Xamarin section by Dan Rigby, and the UWP section by Michael Crump.

You can participate too. Did you write a great blog post, or just read one? Do you want everyone to know about an amazing new contribution or a useful library? Did you make or play a great game built on .NET?
We’d love to hear from you, and feature your contributions on future posts.

This week’s post (and future posts) also contains news I first read on The ASP.NET Community Standup, on Weekly Xamarin, on F# weekly, and on Chris Alcock’s The Morning Brew.


Another Update to Visual Studio 2017 Release Candidate


Thank you for taking time to try out Visual Studio 2017 RC and sharing all your feedback. Today we have another update to Visual Studio 2017 Release Candidate which mostly contains bug fixes. Take a look at the Visual Studio 2017 Release Notes and Known Issues for the full list of what’s changed with this update.

Please try this latest update and share your feedback. For problems, let us know via the Report a Problem option in the upper right corner of the VS title bar. Track your feedback on the developer community portal. For suggestions, let us know through UserVoice.

John Montgomery, Director of Program Management for Visual Studio

@JohnMont is responsible for product design and customer success for all of Visual Studio, C++, C#, VB, .NET and JavaScript. John has been at Microsoft for 18 years working in developer technologies.

Announcing .NET Core Tools Updates in VS 2017 RC


Today, we are releasing updates to the .NET Core SDK, included in Visual Studio 2017 RC. You can also install the .NET Core SDK for use with Visual Studio Code or at the command line, on Windows, Mac and Linux. Check out the Visual Studio blog to learn more about this Visual Studio 2017 update.

The following improvements have been made in the release:

  • Templates — dotnet new has been updated and now is based on a new templating engine.
  • The location of the .NET Standard class library template in Visual Studio has been moved to the new .NET Standard node, based on feedback.
  • Quality — ~50 fixes have been made across the tools to improve product reliability.

The quality fixes have been made across the .NET CLI, NuGet, MSBuild and also in Visual Studio. We will continue to squash bugs as we get closer to Visual Studio 2017 RTM. Please continue sharing your feedback on the overall experience.

Getting the Release

This .NET Core SDK release is available in Visual Studio 2017 RC, as part of the .NET Core cross-platform development workload. It is also available in the ASP.NET and web development workload and as an optional component of the .NET desktop development workload. These workloads can be selected as part of the Visual Studio 2017 RC installation process. The ability to build and consume .NET Standard class libraries is available in all of the above workloads and in the Universal Windows Platform development workload.

You can also install the .NET Core SDK release for use with Visual Studio Code or at the command line on Windows, macOS and Linux by following the instructions at .NET Core 1.0 – RC4 Download.

The release is also available as Docker images, in the dotnet repo. The following SDK images are now available:

  • 1.0.3-sdk-msbuild-rc4
  • 1.0.3-sdk-msbuild-rc4-nanoserver
  • 1.1.0-sdk-msbuild-rc4
  • 1.1.0-sdk-msbuild-rc4-nanoserver
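If you want to try one of these images locally, and assuming they are pulled from the microsoft/dotnet repository on Docker Hub (the “dotnet repo” mentioned above), a quick smoke test looks something like this:

$ docker pull microsoft/dotnet:1.1.0-sdk-msbuild-rc4
$ docker run --rm -it microsoft/dotnet:1.1.0-sdk-msbuild-rc4 dotnet --info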

The aspnetcore-build repo has also been updated.

Changes to Docker Images

We made an important change with this release to the tags in the dotnet repo. The latest and nanoserver tags now refer to MSBuild SDK images. The latest tag now refers to the same image as 1.1.0-sdk-msbuild-rc4, while nanoserver now refers to the same image as 1.1.0-sdk-msbuild-rc4-nanoserver. Previously, those two tags referred to the same images as 1.1.0-sdk-projectjson-rc3 and 1.1.0-sdk-projectjson-rc3-nanoserver, respectively.

This is a breaking change, since the msbuild SDK is not compatible with the project.json-based SDK. We need to start moving the .NET Core ecosystem to the msbuild SDK, sooner than expected. We had originally planned to make this change at Visual Studio 2017 RTM. The number of times the latest tag is being pulled is growing much faster than we expected, making the break worse with each passing day. As a result, we were compelled to make this change with this release.

You can continue to use the project.json images for now, listed below, to give you more time to transition to the msbuild images (see dotnet migrate). Changing to these more specific tags is a one-line change in a Dockerfile, as sketched after the list.

  • 1.1.0-sdk-projectjson-rc3
  • 1.1.0-sdk-projectjson-rc3-nanoserver
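As a sketch, pinning to one of these tags (or moving to an msbuild tag) is just a matter of changing the FROM line in your Dockerfile; the repository name below assumes the images come from the microsoft/dotnet repo on Docker Hub:

# Before: floating tag, which now resolves to the msbuild SDK
FROM microsoft/dotnet:latest

# After: pin to the project.json SDK while you migrate ...
FROM microsoft/dotnet:1.1.0-sdk-projectjson-rc3

# ... or move to the msbuild SDK
# FROM microsoft/dotnet:1.1.0-sdk-msbuild-rc4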

Note: We are no longer updating the project.json images, so please do plan your transition to the msbuild images. For example, only the msbuild SDK images will be updated when we release the 1.0.4 and 1.1.1 runtime updates, expected later this quarter.

We apologize if this change breaks you. We will be providing general guidance on how best to use our tags to avoid a similar situation in the future. We’ve been learning a lot about Docker over the last several months, particularly around versioning and naming. Expect a blog post soon that addresses these issues.

Changes to Supported Linux Distros

Fedora 23 and openSUSE 13.2 recently went out of support, per their respective project lifecycles. As a result, we are no longer supporting or building for Fedora 23 and openSUSE 13.2.

We will be publishing a more formal policy on Linux distro support, in particular on managing end-of-life of distros. There will be opportunity for feedback on the policy before it is finalized.

Project Files

In the RC3 release, we made major improvements to make the csproj project files smaller. If you are using .NET Core project files created with earlier Visual Studio 2017 versions (before RC3), you should read the Updating Project Files section of the RC3 blog post to learn about changes you need to make to your project files.

dotnet new

The dotnet new command is one of the most important parts of the .NET Core tools experience. It is useful for both new and experienced .NET Core users. I know that people who use and test the product on a daily basis use dotnet new all the time for experiments and prototypes. I do! It’s also documented on a lot of websites and markdown pages to help users get started with .NET Core. That said, we always knew that dotnet new was a little lacking and decided to improve it.

In short, we want dotnet new to have the following characteristics:

  • Powerful — expressive and scriptable command-line syntax.
  • Helpful — an interactive mode helps users pick the templates they need (think Yeoman).
  • Extensible — anyone can write templates for dotnet new!
  • Updatable — templates can be updated outside of primary delivery vehicles (e.g. Visual Studio, .NET Core SDK).
  • Platform — can be used by tools like Visual Studio and generator-aspnet (think yo aspnet).

dotnet new is now based on a new templating engine, which you can check out at dotnet/templating. It already covers what the RC3 version of dotnet new did. We’ll continue to add to it and improve it over the next several months, getting it to the point that it satisfies all the characteristics above. For the immediate term, we’re focused on ensuring that it has the right quality level for Visual Studio 2017 RTM.

Improvements

We have updated dotnet new in the RC4 release with the following features:

You can now specify a target directory for your new template, with the -o argument, such as in the following example: dotnet new console -o awesome-new-tool. If the target directory does not exist, it will be created for you. This can also be combined with the -n argument to name projects, such as in the following example: dotnet new console -n awesome-new-tool -o src/awesome.

Target frameworks now have their own argument, -f. You can specify a target framework for any template, provided it is a legal value, such as in: dotnet new console -f netcoreapp1.0. The target framework values are the same as the ones used in the project files.

Solution file management has been improved. You can now create an empty solution file with dotnet new sln and then add projects to it. You can create solution files before or after project files, depending on your preferred workflow. If you have been using the older project.json-based tooling, you can think of solution files as the replacement for global.json files.
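For example, here’s a minimal sketch of that workflow; the solution and project names are made up, and it assumes the dotnet sln add command available in the RC4 tools:

$ dotnet new sln -n MySolution
$ dotnet new console -n MyApp -o src/MyApp
$ dotnet sln MySolution.sln add src/MyApp/MyApp.csproj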

Important Changes

The basic dotnet new (no arguments) experience no longer defaults to creating a console template, as it did in RC3 and earlier releases. The dotnet new command will now print the available set of templates, much like dotnet new --help. In a later release, we may update dotnet new to start an interactive new template experience, which helps you select the right template based on a series of questions.

The new command line has been streamlined. To create a project from a template, you type dotnet new console or dotnet new web for the console app or web app templates, respectively. The RC3 and earlier tools versions required a -t argument before the template name, such as dotnet new -t web.

Some of the template names changed, specifically Lib (now classlib) and Xunittest (now xunit). For RC4, you will need to use the new template names.
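For instance, creating a class library before and after this change looks like this (a sketch based on the renames described above):

$ dotnet new -t Lib        # RC3 and earlier
$ dotnet new classlib      # RC4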

Walkthrough of the new template experience

You are probably curious about the new dotnet new experience. Sayed Hashimi, the Program Manager for dotnet new, wrote the following walkthrough to give you a good idea of what to expect. That said, I encourage you to install the RC4 SDK and try it out for yourself.

Sayed’s walkthrough was done on Linux. You can replicate the same experience on Windows. Just make sure to replace the Linux commands with their equivalents in your favorite Windows shell.

Getting familiar with the new new

First let’s get a little familiar with new by displaying the help using the dotnet new --help command. The result is shown below.

$ dotnet new --help
Template Instantiation Commands for .NET Core CLI.

Usage: dotnet new [arguments] [options]

Arguments:
  template  The template to instantiate.

Options:
  -l|--list         List templates containing the specified name.
  -lang|--language  Specifies the language of the template to create
  -n|--name         The name for the output being created. If no name is specified, the name of the current directory is used.
  -o|--output       Location to place the generated output.
  -h|--help         Displays help for this command.
  -all|--show-all   Shows all templates

Templates                                 Short Name      Language      Tags
--------------------------------------------------------------------------------------
Console Application                       console         [C#], F#      Common/Console
Class library                             classlib        [C#], F#      Common/Library
Unit Test Project                         mstest          [C#], F#      Test/MSTest
xUnit Test Project                        xunit           [C#], F#      Test/xUnit
Empty ASP.NET Core Web Application        web             [C#]          Web/Empty
MVC ASP.NET Core Web Application          mvc             [C#], F#      Web/MVC
Web API ASP.NET Core Web Application      webapi          [C#]          Web/WebAPI
Solution File                             sln                           Solution

Examples:
    dotnet new mvc --auth None --framework netcoreapp1.0
    dotnet new mstest --framework netcoreapp1.0
    dotnet new --help

From the help output we can see that to create a project we can execute dotnet new <template name>. The template names are displayed in the results of --help, but you can also get the names using dotnet new -l.

Creating Projects

Let’s create a new HelloWorld console app. The most basic way to create a console app is using the
command dotnet new console. The other parameters that we can specify are listed below.

  • -n|--name
  • -o|--output
  • -lang|--language

In this case we want to create a C# console app named HelloWorld in the src/HelloWorld directory. Since C# is the default language for the console app template (the default value is indicated in the help output by square brackets, e.g. [C#]), there is no need to pass a value to -lang. To create the project, execute dotnet new console -n HelloWorld -o src/HelloWorld. The result is shown below.

$ dotnet new console -n HelloWorld -o src/HelloWorld
Content generation time: 32.4513 ms
The template "Console Application" created successfully.

Let’s see what was generated by listing the files on disk.

$ ls -R src
./src:
HelloWorld

./src/HelloWorld:
HelloWorld.csproj   Program.cs

The HelloWorld project was created as expected in src/HelloWorld, and it consists of two files HelloWorld.csproj
and Program.cs. Let’s restore the packages and run the app using dotnet restore and then dotnet run. See the result.

$ cd src/HelloWorld/
$ dotnet restore
  Restoring packages for /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/HelloWorld.csproj...
  Generating MSBuild file /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/obj/HelloWorld.csproj.nuget.g.props.
  Generating MSBuild file /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/obj/HelloWorld.csproj.nuget.g.targets.
  Writing lock file to disk. Path: /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/obj/project.assets.json
  Restore completed in 953.36 ms for /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/HelloWorld.csproj.
  NuGet Config files used:
      /Users/sayedhashimi/.nuget/NuGet/NuGet.Config
  Feeds used:
      https://api.nuget.org/v3/index.json
$ dotnet run
Hello World!

From the output we can see the packages were restored successfully and when the app was executed Hello World! was
printed to the console.

Templates with Options

Templates can expose options, which customize template output based on user input. We can see those options by calling --help on a template, such as with dotnet new mvc --help.

$ dotnet new mvc --help
Template Instantiation Commands for .NET Core CLI.

Usage: dotnet new [arguments] [options]

Arguments:
  template  The template to instantiate.

Options:
  -l|--list         List templates containing the specified name.
  -lang|--language  Specifies the language of the template to create
  -n|--name         The name for the output being created. If no name is specified, the name of the current directory is used.
  -o|--output       Location to place the generated output.
  -h|--help         Displays help for this command.
  -all|--show-all   Shows all templates

MVC ASP.NET Core Web Application (C#)
Author: Microsoft
Options:
  -au|--auth           The type of authentication to use
                           None          - No authentication
                           Individual    - Individual authentication
                       Default: None
  -uld|--use-local-db  Whether or not to use LocalDB instead of SQLite
                       bool - Optional
                       Default: false
  -f|--framework
                           netcoreapp1.0    - Target netcoreapp1.0
                           netcoreapp1.1    - Target netcoreapp1.1
                       Default: netcoreapp1.0

Here we can see that the mvc template has three specific parameters. In this case let’s create an mvc app named MyWeb in the src/MyWeb directory targeting netcoreapp1.1 with individual authentication. To do that we will execute dotnet new mvc -n MyWeb -o src/MyWeb -au Individual -f netcoreapp1.1.

$ dotnet new mvc -n MyWeb -o src/MyWeb -au Individual -f netcoreapp1.1
Content generation time: 429.6003 ms
The template "MVC ASP.NET Web Application" created successfully.

Now the project has been created in the src/MyWeb directory. Let’s take a look.

$ ls -lp src/MyWeb/
total 80
drwxr-xr-x  5 sayedhashimi  staff   170 Feb  3 10:43 Controllers/
drwxr-xr-x  4 sayedhashimi  staff   136 Feb  3 10:43 Data/
drwxr-xr-x  5 sayedhashimi  staff   170 Feb  3 10:43 Models/
-rwxr--r--  1 sayedhashimi  staff  1767 Feb  3 10:43 MyWeb.csproj
-rwxr--r--  1 sayedhashimi  staff  4096 Feb  3 10:43 MyWeb.db
-rwxr--r--  1 sayedhashimi  staff   544 Feb  3 10:43 Program.cs
drwxr-xr-x  5 sayedhashimi  staff   170 Feb  3 10:43 Services/
-rwxr--r--  1 sayedhashimi  staff  3081 Feb  3 10:43 Startup.cs
drwxr-xr-x  8 sayedhashimi  staff   272 Feb  3 10:43 Views/
-rwxr--r--  1 sayedhashimi  staff   168 Feb  3 10:43 appsettings.Development.json
-rwxr--r--  1 sayedhashimi  staff   185 Feb  3 10:43 appsettings.json
-rwxr--r--  1 sayedhashimi  staff   197 Feb  3 10:43 bower.json
-rwxr--r--  1 sayedhashimi  staff   604 Feb  3 10:43 bundleconfig.json
-rwxr--r--  1 sayedhashimi  staff    61 Feb  3 10:43 runtimeconfig.template.json
-rwxr--r--  1 sayedhashimi  staff   680 Feb  3 10:43 web.config
drwxr-xr-x  8 sayedhashimi  staff   272 Feb  3 10:54 wwwroot/

Future Plans

We want to enable everyone to create templates and make it easy to share those templates. Templates will be installable as NuGet packages or from a folder. In the meantime, check out the templating wiki for info on creating templates. I was happy to see the Custom project templates using dotnet new post by one of the community members that we’ve been working with for early feedback. Here’s my favorite quote from his post:

“This new method makes creating project templates about as easy as it’s ever going to get and allows
really easy sharing, versioning and personalization of project templates.”.

We are also working on a way to enable templates to be updated. For critical fixes we are considering updating templates without any user interaction. For general updates we are looking to add a new --update option.

We are working on plans to integrate the templating engine with the Visual Studio family of IDEs and other template experiences, such as Yeoman. We have a vision of everyone producing templates in a single format that works with all .NET tools. Wouldn’t that be nice!?! If you’re interested in learning more about how yo aspnet relates to dotnet new see my comments on the topic.

Last, we’re hoping to update the command line experience to be interactive. In this mode we will
prompt for things like the template name, project name and the other information that you otherwise need to provide as command line arguments. We believe that interactive is the ultimate new user experience.

Summary

I’ve been asked several times recently when the .NET Core Tools will ship a final RTM release. The tools will ship as an RTM release the same day as Visual Studio 2017 RTM. We’re getting close. As I said at the start of the post, we’ve got a few more bugs to squash first and then we’ll be happy to get the release out the door for you to use.

In this release, we’ve focused on quality improvements, and we also switched over to a new and more capable templating engine. The new dotnet new implementation is largely a replacement of the functionality that was included in the RC3 release. In upcoming releases, you should expect to see some great new features that make you more productive at the command line. We hope to integrate this new system into Visual Studio, too, enabling us (and you!) to share templates across all .NET Core tools.

Thanks to Sayed Hashimi for the write-up on the new dotnet new implementation!

As always, please share your feedback, either in the comments, in email, or on Twitter.

Thanks!

More on GVFS


After watching a couple of days of GVFS conversation, I want to add a few things.

What problems are we solving?

GVFS (and the related Git optimizations) really solves 4 distinct problems:

  1. A large number of files – Git doesn’t naturally work well with hundreds of thousands or millions of files in your working set.  We’ve optimized it so that operations like git status are reasonable, commit is fast, push and pull are comfortable, etc.
  2. A large number of users – Lots of users create 2 pretty direct challenges.
    1. Lots of branches – Users of Git create branches pretty prolifically.  It’s not uncommon for an engineer to build up ~20 branches over time; multiply 20 by, say, 5,000 engineers and that’s 100,000 branches.  Git just won’t be usable.  To solve this, we built a feature we call “limited refs” into our Git service (Team Services and TFS) that causes the service to project only the branches “you care about” to your Git client.  You can favorite the branches you want and Git will be happy.
    2. Lots of pushes – Lots of people means lots of code flowing into the server.  Git has critical serialization points that will cause a queue to back up badly.  Again, we did a bunch of work on our servers to handle the serialized index file updates in a way that causes very little contention.
  3. Big files – Big binary files are a problem in Git because Git copies all the versions to your local Git repo, which makes for very slow operations.  GVFS’s virtualized .git directory means it only pulls down the files you need when you need them.
  4. Big .git folder – This one isn’t exactly distinct.  It is related to a large number of files and big files, but in general the multiplication of lots of files, lots of history and lots of binary files creates a huge and unmanageable .git directory that gobbles up your local storage and slows everything down.  Again, GVFS’s virtualization only pulls down the content you need, when you need it, making it much smaller and faster.

There are other partial solutions to some of these problems – like LFS, sparse checkouts, etc.  We’ve tackled all of these problems in an elegant and seamless way.  It turns out #2 is solved purely on the server – it doesn’t require GVFS and will work with any Git client.  #1, #3 and #4 are addressed by GVFS.

GVFS really is just Git

One of the other things I’ve seen in the discussions is how we are turning Git into a centralized version control system (and hence removing all the goodness).  I want to be clear that I really don’t believe we are doing that and would appreciate the opportunity to convince you.

Looking at the server from the client, it’s just Git.  All TFS and Team Services hosted repos are *just* Git repos.  Same protocols.  Every Git client that I know of in the world works against them.  You can choose to use the GVFS client or not.  It’s your choice.  It’s just Git.  If you are happy with your repo performance, don’t use GVFS.  If your repo is big and feeling slow, GVFS can save you.

Looking at the GVFS client, it’s also “just Git” with a few exceptions.  It preserves all of the semantics of Git – The version graph is a Git version graph.  The branching model is the Git branching model.  All the normal Git commands work.  For all intents and purposes you can’t tell it’s not Git.  There are a few exceptions.

  1. GVFS only works against TFS and Team Services hosted repos.  The server must have some additional protocol support to work with GVFS.  Also, the server must be optimized for large repos or you aren’t likely to be happy.  We hope this won’t remain the case indefinitely.  We’ve published everything a Git server provider would need to implement GVFS support.
  2. GVFS doesn’t support Git filters.  Git filters transform file content on the fly during a retrieval (like end of line translations).  Because GVFS is projecting files into the file system, we can’t transform the file on “file open”.
  3. GVFS has limits on going offline.  In short, you can’t do an offline operation if you don’t have the content it needs.  However, if you do have the content, you can go offline and everything will work fine (commits, branches, everything).  In the extreme case, you could pre-fetch everything and then every operation would just work – but that would kind of defeat the purpose of virtualization.  In a more practical case, you could just pre-fetch the content of the folders you generally use and leave off the stuff you don’t.  We haven’t built tools yet to manage your locally cached state but there’s no reason we (or you) can’t.  With proper management of pre-fetching, GVFS can even give a great, full-featured offline experience.

That’s all I know of.  Hopefully, if GVFS takes off, #1 will go away.  But remember, if you have a repo in GVFS and you want to push to another Git server, that’s fine.  Clone it again without the GVFS client, add a remote to the alternate Git server and push.  That will work fine (ignoring the fact that it might be slow because it’s big).  My point is, you are never locked in.  And #3 can be improved with fairly straight forward tooling.  It’s just Git.

Hopefully this sheds a little more light on the details of what we’ve done.  Of course, all the client code is in our GitHub project, so feel free to validate my assertions.

Thanks,

Brian

Our team has acquired the extension, ‘Wiki’ by Agile Extensions, and plan to provide a built-in Wiki experience



We’re firm believers that a vibrant extension ecosystem is critical to having a best-in-class DevOps product. Not only does it organically bring new technologies and solutions that our customers are looking for, it also builds a community around our product which is a huge contributor to the success of any platform.

A recent example of that partnership and success is the Wiki extension, which has done well in the Marketplace. We have been exploring different ways to bring more ‘social’ experiences to Team Services, and it became clear that a Wiki, which has been on the team’s backlog for quite some time, is going to be necessary to fully realize that vision.

Today, we’re excited to announce that we have acquired the Marketplace extension, ‘Wiki’, from Agile Extensions.

This decision is driven by a long-term plan to offer a built-in Wiki experience for Team Services instead of requiring people to install an extension from the Marketplace. To achieve this we will leverage the existing extension as a starting point, and deprecate the Marketplace listing once we’re ready to release our built-in solution. Acquiring the extension is our first step, and I’m including some additional thoughts below. If I’ve left anything out, or if you have any unanswered questions, don’t hesitate to leave a comment here or tweet me @JoeB_in_NC.

I am a ‘Wiki’ customer, how does this impact me?

In the short term, this acquisition will not impact you and there is no action required. The extension will show up as being published by a different publisher, ‘Microsoft DevLabs’ instead of ‘Agile Extensions’, but you’ll be able to continue using it without disruption. Long-term, we will deprecate the Marketplace listing and give customers an option to migrate their Wiki content to the built-in experience.

Why not build your own and keep other Wiki experiences in the Marketplace?

Our publishers are extended members of our team; their success is our success and vice versa. In the case of Wiki, our team had been working closely with Agile Extensions on their experience, and it was during that process that we realized we needed a built-in solution. Agile Extensions had made such a good start with their extension that it made the most sense to purchase what they had already created, especially after having worked so closely with them.

Do you have more information to share about your built-in Wiki plans?

We don’t have a specific timeline to share with you but the team is busy preparing designs and determining the direction we want to go with Wiki. One of the biggest inputs to that process is the set of feedback that the current Wiki extension has received. We’ve collected all of that and Agile Extensions has been very supportive in ensuring this information is not lost. Some of the top of mind items we’ve heard from Wiki users that we’ll consider are:

  1. Welcome page: there is an inherent desire to have a customized home page that teams can author
  2. Navigation: As Wikis grow, finding the right page becomes a pain so we’ll need to land this properly
  3. Browsing: Ordering pages is desirable, but authoring an ordered list of pages is currently painful, so making that easier would be very valuable
  4. Authors: Pages are authored not only by developers but also by content writers and project managers, so having a simple authoring mechanism is key

These are items we know come from the feedback, but we’re always listening. Please, if you have additional thoughts about Wiki and what you want to see in Team Services then leave a comment!

A Wiki for Team Services and TFS


One of the big areas of investment for us recently is “social” experiences.  I’m using a fairly broad definition of that term, including a focus on “me” and my stuff and capabilities that improve collaboration across my team, project, organization.  You’ve already seen lots of pieces that support this general direction:

  • The new account pages with “me” views of work, favorites, pull requests, etc.
  • A new project landing page optimized for exploring projects across your organization.
  • To some degree, the new navigation experience that cleans things up and gives us some room to continue to grow new experiences.
  • Follow and favorites experiences and significantly improved notifications.
  • The beginnings of mobile experiences so you get a great experience regardless of the device you are on.
  • Continued investment in pull requests, policies, etc.
  • Code and work item search capabilities to make it easy to find what I’m looking for.

All of these are steps on a journey to an improved collaboration experience.

As part of our work, we concluded we need to have a pretty full-featured Wiki experience to enable all the collaboration experiences we feel are needed.  Fortunately, one of our partners was already building a Wiki extension and had published it in the VS Marketplace.  We decided to purchase the extension and use it as a stepping stone to get to where we want to be with Wikis.  We’re just getting going on it now and it will probably take a couple of sprints for us to get our feet under us and start producing a preview of the “V2” of that extension.  For now, you can use the current extension in the marketplace.  It’s not complete but it’s reasonably functional.  We’ve changed the publisher to Microsoft DevLabs for now to represent the transitional state.

As we build it out, we’ll make sure it works both on Team Services and Team Foundation Server.  I hate to comment on the business model at this early a stage, but we think we’ll just include it as part of your base TFS/Team Services license – so we don’t expect there will be any additional charge.  In fact, I think it won’t even be in the marketplace when we are done – it will just be built in.  Some of these things we still need to settle but, at a high level, we think of Wiki as a fundamental part of the experience and don’t want it to be unavailable to some people.

Stay tuned for more in the next few months.  Also, watch for a feature timeline update early next month to see more of the “social” work on our roadmap.

You can read more if you are interested on our official Wiki announcement blog post.

Brian

Targeting the Windows Subsystem for Linux from Visual Studio


The Windows Subsystem for Linux (WSL) was first introduced at Build in 2016 and was delivered as an early beta in Windows 10 Anniversary Update. Since then, the WSL team has been hard at work, dramatically improving WSL’s ability to run an ever increasing number of native Linux command-line binaries and tools, including most mainstream developer tools, platforms and languages, and many daemons/services* including MySQL, Apache, and SSH.

With the Linux development with C++ workload in Visual Studio 2017 you can use the full power of Visual Studio for your C/C++ Linux development. Because WSL is just another Linux system, you can target it from Visual Studio by following our guide on using the Linux workload.  This gives you a lot of flexibility to keep your entire development cycle locally on your development machine without needing the complexity of a separate VM or machine. It is, however, worth covering how to configure SSH on Bash/WSL in a bit more detail.

Install WSL

If you’ve not already done so, you’ll first need to enable developer mode and install WSL itself. This only takes a few seconds, but does require a reboot.

When you run Bash for the first time, you’ll need to follow the on-screen instructions to accept Canonical’s license, download the Ubuntu image, and install it on your machine. You’ll then need to choose a UNIX username and password; these don’t need to match your Windows login username and password. You’ll only need to enter the UNIX username and password in the future when you use sudo to elevate a command, or to log in “remotely” (see below).

Setting up WSL

Now you’ll have a vanilla Ubuntu instance on your machine within which you can run any ELF-64 Linux binary, including those that you download using apt-get!

Before we continue, let’s install the build-essential package so you have some key developer tools, including the GNU C++ compiler, linker, etc.:

$ sudo apt install -y build-essential

Install & configure SSH

Let’s use the ‘apt’ package manager to download and install SSH on Bash/WSL:

$ sudo apt install -y openssh-server

Before we start SSH, you will need to configure it; you only need to do this once. Run the following command to edit the sshd config file:

$ sudo nano /etc/ssh/sshd_config

Scroll down to the “PasswordAuthentication” setting and make sure it’s set to “yes”:

Editing sshd_config in nano

Hit CTRL + X to exit, then Y to save.

Now generate SSH keys for the SSH instance:

$ sudo ssh-keygen -A

Start SSH before connecting from Visual Studio:

$ sudo service ssh start

*Note: You will need to do this every time you start your first Bash console. As a precaution, WSL currently tears down all Linux processes when you close your last Bash console!
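Optionally, you can sanity-check the SSH setup before connecting from Visual Studio by SSHing into the instance from the Bash console itself (substitute your own UNIX username):

$ ssh <your-unix-username>@localhost

If you are prompted for your UNIX password and get a shell, sshd is running and password authentication is working.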

Install & configure Visual Studio

For the best experience, we recommend installing Visual Studio 2017 RC (or later) to use Visual C++ for Linux. Be sure to select the Visual C++ for Linux workload during the installation process.

Visual Studio installer with Linux C++ workload

Now you can connect to the Windows Subsystem for Linux from Visual Studio by going to Tools > Options > Cross Platform > Connection Manager. Click Add and enter “localhost” for the hostname, along with your WSL username and password.

VS Connection Manager with WSL

Now you can use this connection with any of your existing C++ Linux projects or create a new Linux project under File > New Project > Visual C++ > Cross Platform > Linux.

In the future, we’ll publish a more detailed post showing the advantages of working with WSL, particularly leveraging the compatibility of binaries built using the Linux workload to deploy on remote Linux systems.

For now, note that, starting with Windows 10 Creators Update, Bash on the Windows Subsystem for Linux (Bash/WSL) is a real Linux system from the perspective of Visual Studio.

Trying out "dotnet new" template updates and csproj with VS2017


I updated my Visual Studio 2017 RC installation today. Here's the release notes. You just run "Visual Studio Installer" if you've already got a version installed and it updates. The updating process reminds me a little of how Office 365 updates itself. It's not as scary as VS updates of the past. You can download the VS2017 RC at https://www.visualstudio.com and it works side by side with your existing installs. I haven't had any issues yet.

New Templating Engine for .NET Core CLI

It also added/updated a new .NET Core SDK. I am a fan of the command line "dotnet.exe" tooling and I've been pushing for improvements in that experience. A bunch of stuff landed in this update that I've been waiting for. Here's dotnet new:

C:\Users\scott\Desktop\poop> dotnet new
Template Instantiation Commands for .NET Core CLI.

Usage: dotnet new [arguments] [options]

Arguments:
template The template to instantiate.

Options:
-l|--list List templates containing the specified name.
-lang|--language Specifies the language of the template to create
-n|--name The name for the output being created. If no name is specified, the name of the current directory is used.
-o|--output Location to place the generated output.
-h|--help Displays help for this command.
-all|--show-all Shows all templates


Templates Short Name Language Tags
--------------------------------------------------------------------------------------
Console Application console [C#], F# Common/Console
Class library classlib [C#], F# Common/Library
Unit Test Project mstest [C#], F# Test/MSTest
xUnit Test Project xunit [C#], F# Test/xUnit
Empty ASP.NET Core Web Application web [C#] Web/Empty
MVC ASP.NET Core Web Application mvc [C#], F# Web/MVC
Web API ASP.NET Core Web Application webapi [C#] Web/WebAPI
Solution File sln Solution

Examples:
dotnet new mvc --auth None --framework netcoreapp1.0
dotnet new console --framework netcoreapp1.0
dotnet new --help

There is a whole new templating engine now. The code is here https://github.com/dotnet/templating and you can read about how to make your own templates on the wiki.

I did a "dotnet new xunit" and it made the csproj file and a Unit Test. The csproj listing didn't survive this export, but it's tiny: an Exe project targeting netcoreapp1.0.

That's not too bad. Here's a library with no references: its csproj just targets netstandard1.4.

Note there's no GUIDs in the csproj. Sweet.
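For reference, here's a minimal sketch of what that library csproj looks like in the new SDK-style format (reconstructed, since the original listing didn't survive the export; the exact contents may differ slightly):

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard1.4</TargetFramework>
  </PropertyGroup>

</Project>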

Remember also that there was talk that you wouldn't have to edit your csproj manually? Check this out:

C:\Users\scott\Desktop\poop\lib> dotnet add package Newtonsoft.Json
Microsoft (R) Build Engine version 15.1.545.13942
Copyright (C) Microsoft Corporation. All rights reserved.

Writing C:\Users\scott\AppData\Local\Temp\tmpBA1D.tmp
info : Adding PackageReference for package 'Newtonsoft.Json' into project 'C:\Users\scott\Desktop\poop\lib\lib.csproj'.
log : Restoring packages for C:\Users\scott\Desktop\poop\lib\lib.csproj...
info : GET https://api.nuget.org/v3-flatcontainer/newtonsoft.json/index.json
info : OK https://api.nuget.org/v3-flatcontainer/newtonsoft.json/index.json 1209ms
info : GET https://api.nuget.org/v3-flatcontainer/newtonsoft.json/9.0.1/newtonsoft.json.9.0.1.nupkg
info : OK https://api.nuget.org/v3-flatcontainer/newtonsoft.json/9.0.1/newtonsoft.json.9.0.1.nupkg 181ms
info : Package 'Newtonsoft.Json' is compatible with all the specified frameworks in project 'C:\Users\scott\Desktop\poop\lib\lib.csproj'.
info : PackageReference for package 'Newtonsoft.Json' version '9.0.1' added to file 'C:\Users\scott\Desktop\poop\lib\lib.csproj'.

Doing "dotnet add package foo.bar" automatically gets the package from NuGet and adds it to your csproj. Just like doing "Add NuGet Package" (or add reference) in Visual Studio. You don't even have to open or look at your csproj.
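After that command completes, the only change to lib.csproj is a new PackageReference item, something like this (the version matches what the log above reported):

<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="9.0.1" />
</ItemGroup>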

I'm going to keep digging into this. We're getting into a nice place where someone could easily make a custom template, then "nuget in" that template, then "File | New | Your Company's Template" without needing yeoman, etc.

Please share your feedback.

Also, be sure to check out the new and growing Docs site at https://docs.microsoft.com/en-us/dotnet






55 countries with Real Time Traffic in Bing Maps


The Bing Maps team is happy to announce that real-time traffic flow data is now available in 55 countries. Bing Maps uses traffic flow data in two ways. The first is to provide real-time and predictive route calculations. The second is a traffic overlay of color coded roads on the map to indicate the real-time flow of traffic. All of the interactive Bing Maps controls provide an option to overlay real-time traffic flow data on top of the map. The Bing Maps Version 8 Web Control (v8) provides this functionality through the Traffic module.



Point-based traffic incident data, such as car accidents and construction, is available in 35 countries. In addition to accessing traffic incident data through the interactive Bing Maps controls, this data can also be accessed directly through the Bing Maps REST Services Traffic API, and through the Traffic Data Source in the Bing Spatial Data Services.



A complete list of countries in which traffic data is available can be found here.

If you have any questions or feedback about Bing Maps, please let us know on the Bing Maps forums or visit the Bing Maps website to learn more about the Bing Maps platform.

- Bing Maps Team


Join Us: Visual Studio 2017 Launch Event and 20th Anniversary


Twenty-five years ago, I started my first day at Microsoft as a developer on the Access team, and then as a developer on a newly created product – Visual InterDev. I remember how the emphasis was on the Visual part of our various product offerings; we have come a long way to the Visual Studio we have now.

Today, I’m proud and humbled that Visual Studio is turning twenty – we’re celebrating two decades of Visual Studio! As we hit this great milestone, I’m also excited to announce that Visual Studio 2017 will be released on March 7.

As part of the team that created the first version of Visual Studio, it was an ambitious goal to bring together everything developers needed to build applications for the client, the server, and the web. Twenty years ago, on January 28, 1997, we announced that we were going to launch Visual Studio 97 – a single product that would bring together best-of-breed productivity tools for any developer. This was no trivial undertaking. It was a challenging task to bring Visual Basic, Visual C++, Visual J++, Visual FoxPro, and Visual InterDev into one single development environment. The team delivered, kicking off decades of incredible productivity for millions of developers worldwide.

Over the years, Visual Studio grew from an IDE to a suite of products and services, including Visual Studio Team Services, Visual Studio Code, and many others. The family of Visual Studio products extends across platforms, enabling developers to build mobile-first, cloud-first apps that span Android, iOS, Linux, MacOS, and Windows. It also offers industry-leading DevOps practices across all types of projects, as well as tight integration with the Azure cloud.

On March 7, we are proud to bring you our newest release, Visual Studio 2017, with a livestreamed two-day launch event at https://launch.visualstudio.com. Brian Harry, Miguel de Icaza, and Scott Hanselman will join me on stage to share the latest innovations from Visual Studio, .NET, Xamarin, Azure, and more. You will have the opportunity to engage in demo-packed sessions focusing on key improvements within the product. To help you get started, on March 8 we will also bring you a full day of live training with multiple topics to choose from. Save the date.

Whether you’re new to Visual Studio or have been with us on this journey, we want to hear and share your story. Grab your phone and take a short video to tell us a little about your Visual Studio journey:

  • How long have you been using Visual Studio?
  • What is the coolest software you’ve built?
  • What do you like about Visual Studio?
  • How about birthday wishes? How would you say, “Happy Birthday, Visual Studio” in your native language?

Check out this example from Sara Ford:

What memorabilia have you collected over the years? Is it a sticker, t-shirt, mug, poster, button, or something else? Share a photo or a short video clip on Instagram or post your story on Twitter and Facebook using the hashtag #MyVSstory.

I look forward to hearing your stories!

Julia Liuson, Corporate Vice President, Visual Studio

Julia is responsible for developer tools and services, including the programming languages and runtimes designed for a broad base of software developers and development teams as well as for Visual Studio, Visual Studio Code, and the .NET Framework lines of products and services. Julia joined Microsoft in 1992, and has held a variety of technical and management positions while at Microsoft, including the General Manager for Visual Studio Business Applications, the General Manager for Server and Tools in Shanghai, and the development manager for Visual Basic.

Introducing Visually Rich and Highly Informative Weather Answers

Weather forecasts determine what we wear, when and where we take vacations, and can even affect our moods. Winter is an especially critical time for checking weather reports, especially if you’re the type that likes to hit the slopes. That’s why Bing has just released two new experiences designed to help you navigate everyday weather, as well as check the latest snowfall at your favorite ski resort.
 
Enhanced Weather Experience
 
We refreshed the weather experience on Bing to help you plan your days using visually rich forecast information. Now, when you search for the weather in your city (for example, New York), Bing provides an animated experience using real-time forecasts. You can view the forecast by the hour using the interactive slider. Move the slider and the background animation and forecast data update to match the time of day, helping you to plan your day with confidence.
 
For major cities across the world, you’ll see the cityscapes in the background that adjust based on the time of day. You can see the Eiffel Tower light up Paris at night, or the sun glint off the Chicago River.  For other cities across the world we provide a color based gradient that indicates the weather conditions throughout the day. And we’ll continue to provide alerts during extreme weather conditions. 
 
To find weather information, type questions on Bing using natural language such as, “will it rain tomorrow” or “how is the weather this Friday?” You can even ask Bing follow-up questions. If you ask, “how is the weather in Seattle?” followed by “how about tomorrow?” Bing understands you are asking about tomorrow’s weather information for Seattle. It’s all about adapting to the way people speak, making it easier for you to get the information you need, quickly. 
 
 
how cold is this weekend in New York  how is the weather in Burbank
 
Ski Resort Snow Reports
 
If you ski or snowboard, you’re likely used to checking mountain resort conditions in advance of your trip. This means searching for the resort, going to the site, and clicking on several links. To help you get information on your favorite ski resort more quickly, Bing now provides the latest snow conditions and snow forecasts for ski resorts in the United States. Just search for a ski resort and you’ll discover the snow surface condition, snow depth, past and predicted snowfall, the weather forecast, links to live web cams (when available), and the latest status of the lifts and trails at the ski resort.
 
stevens pass ski conditions
stevens pass
 

We hope these new weather experiences help you jump start your day or plan the perfect snow getaway. We’d love to hear your ideas for even more ways we can enhance Bing. To share your ideas, go to Bing Listens.

-The Bing Team




 

Using US Census data with Bing Maps


While creating the code samples for the Bing Maps V8 web control, we wanted to make the samples more realistic and use real data. As such, many of the Bing Maps V8 interactive code samples use data from a number of sources such as earthquake data from the USGS, satellite imagery of hurricanes from NASA, weather radar data from Iowa State University, and 2010 US Census data from the US Census Bureau.

To make the census data easy to integrate with Bing Maps applications, a subset of the data was uploaded into the Bing Spatial Data Services. This data has been exposed through 4 different data sources, each containing census data based on a different type of geographical region: states, counties, ZCTA5 (ZIP Code Tabulation Areas), and the 111th Congressional districts. The Bing Maps team has now made these data sources publicly available in the Bing Spatial Data Services so that you can easily use them in your application. Documentation on these data sources can be found here.

Creating a Census Choropleth map in Bing Maps V8

The following code sample shows how to create a Choropleth map (color coded boundary map) based on the population by state. This code sample uses the Bing Spatial Data Services module that is built into Bing Maps V8 to query the state level 2010 US Census data. This code sample also creates a legend with a gradient scalebar, which is used to color code the boundary data based on its relative population.




   
   
[The embedded HTML/JavaScript sample did not survive this export. It loads the Bing Maps V8 control from http://www.bing.com/api/maps/mapcontrol?callback=GetMap with an async, deferred script tag, queries the state-level census data source, and renders a color-coded legend ranging from 0 to 10,000,000.]

Running this code will display a map of the USA with the US states color coded based on population. A legend for the colors is overlaid on top of the map as well. If you click on any of the states, a notification will appear specifying which state was clicked and its population.

Try it now

Access additional US Census Data

The 2010 US Census data sources that are made available in the Bing Spatial Data Services contain a subset of the data collected by the US Census, primarily population data. The US Census Bureau captures a lot of additional data which has not been included in the newly released data sources in the Bing Spatial Data Services. If you would like to access this data, there are a few options. One option is to download some of the existing geographic data sets from the US Census and upload them into the Bing Spatial Data Services. You can easily upload ESRI Shapefiles and KML files into the Bing Spatial Data Services to create a data source. Alternatively, you can upload this data into a spatial database such as SQL Azure and then expose the data through a custom web service.

Partner Solutions

Don’t want to develop a custom application yourself? Take a look at one of these solutions.

Microsoft Power BI

Power BI, along with the mapping functionality available in Excel, makes it easy to visualize data on Bing Maps. Here are a few useful blog posts on how to do this:

CensusViewer by MoonShadow Mobile

CensusViewer is an online application built by Moonshadow Mobile. This application gives you access to the 2010 and 2000 Census "Summary File 1" data from the U.S. Census Bureau as well as the selected American Community Survey (ACS) data and extensive data including registered voters and frequently updated commercial data sources. It is an excellent online tool for demographic analysis of the U.S. population. With CensusViewer you can navigate the census data from within the familiar Bing Maps interface coupled with Moonshadow’s cutting-edge database technology, to provide an intuitive platform for accessing and analyzing the data. Now, there are a couple different versions which give you access to different levels of data. If you want to try it out yourself, you’re in luck because there’s a free version. Here is a heat map created using this tool of the US population that is at or below the poverty level.

EasyTerritory.com

EasyTerritory is a leading map-based solution for territory management and geospatial business intelligence for Microsoft Dynamics CRM or SQL Server. Powered by Bing Maps for Enterprise, EasyTerritory allows users to geographically build and manage territories and get business intelligence for leads, opportunities, contacts, accounts, or any custom Dynamics CRM entity. EasyTerritory can optionally be deployed without Dynamics CRM, using only SQL Server 2008, 2012, 2014, or SQL Azure. Features of EasyTerritory include territory management, geospatial BI, US Census data sets, route planning, and full legacy GIS integration. Out of the box, this solution includes worldwide political boundary data as well as demographic data for the US, Canada, and parts of Europe. The EasyTerritory solution is available as an online service or can be deployed on-premises.

Bing Predicts the GRAMMY Awards

This Sunday, February 12, music fans will huddle around their TVs and phones to see who will take home a coveted GRAMMY® Award. To see the full list of this year’s nominees, along with Bing’s predictions for who will win in each category, go to Bing and search for GRAMMYs.
 
If you’re a fan of Adele, you’ll be happy to see Bing is predicting she’ll edge out the competition to take home four GRAMMYs, including Record of the Year, Song of the Year, Pop Vocal of the Year, and Pop Solo Performance. Beyoncé, Blackstar and Work are also expected to do well, with Bing predicting each artist to take home multiple awards. And for those cheering on the new artist nominees, Bing predicts it’ll be a close race between Chance the Rapper and Maren Morris!


 
If you’re curious how this year’s list of nominees compares to last year’s winners, just use the link in the carousel titled, “2016 Winners” and you’ll see a carousel of winners.

 
We wish the best of luck to all nominees and hope you have fun cheering on your favorite artists. Regardless of who wins, it’s sure to be a great night for music!
 
-The Bing Team
 
Grammy(s)® is the trademark of The Recording Academy



 

Evolving the Visual Studio Test Platform – Part 4: Together, in the Open


[This is the 4th post in a 4-part series on evolving the Visual Studio Test Platform. You can read the earlier parts here:
Evolving the Visual Studio Test Platform – Part 3,
Evolving the Visual Studio Test Platform – Part 2,
Evolving the Visual Studio Test Platform – Part 1]

The Test Platform is where it is today thanks to its community – a community of adapter writers, test framework writers, extension writers, and application developers, working on platforms ranging from .NET to C++ to JavaScript. The Test Platform has grown to serve a diverse and complex range of lifecycle requirements and is now at a point where it is vital to enable this community to define and shape its evolution.

Open Sourcing of the Test Platform

Three weeks ago, we announced open sourcing the Test Platform. We have released the sources under the MIT open source license. We have published the public repositories on GitHub where the project is hosted:
https://github.com/Microsoft/vstest, and
https://github.com/Microsoft/vstest-docs.
The repos include the complete implementation of the test platform. These are fully open and ready to accept contributions. Over the next several weeks and months we will continue to transfer source and documentation into the repositories and likewise make it open for contributions.

What does this open sourcing mean?

This open source announcement means that the community now has a fully supported, fully open source, fully cross-platform Test Platform to power tests – covering all elements of the lifecycle, from the runner that discovers and executes tests all the way to the extensibility model.

Our “Modernizing the Test Platform” journey

This is not just a one-off event. Rather, it should be seen in the larger context of our steps to modernize the test platform as a whole.

We started by taking a hard look at the MSTest test framework, and how to move that user base forward. Thus, with MSTest V2 we introduced support for ASP.NET Core and .NET Core, added several important and much asked-for features, published the bits to NuGet, updated the Unit Test project templates to reference MSTest V2 starting with VS “15” Preview 4, updated the Create Unit Test and Create IntelliTest wizards to support MSTest V2, and have since been updating the NuGet packages regularly – it’s been just about 8 months since launch and we have already updated the packages 8 times.

We added Parallel Test Execution and several other features spanning the entire lifecycle. We added testing support for .NET Core. The “dotnet test” and the “dotnet vstest” commands are now powered by the test platform.
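
As a minimal sketch, a test like the one below, in a project that references the MSTest.TestFramework and MSTest.TestAdapter packages, is discovered and run by the test platform; the class and method names here are just illustrative.

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CalculatorTests
{
    [TestMethod]
    public void Add_TwoNumbers_ReturnsSum()
    {
        // Arrange / Act
        int result = 2 + 3;

        // Assert - Assert.AreEqual is part of the MSTest V2 framework
        Assert.AreEqual(5, result);
    }
}

Running "dotnet test" from the project folder then executes it, whether you are on Windows, Linux, or Mac.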

At one level, this means that you can carry forward the testing experience and assets you are used to on the desktop (with vstest.console.exe); at another level, it means that the underlying testing infrastructure is now cross-platform – the testing experience you were used to on the desktop now extends to .NET Core, and not only on Windows but on Linux and Mac as well.

Collaborative innovation, transparent development

This moment is not just about sharing the sources under an open license. We will engage in transparent development. Towards that, we will share, make visible, and collaborate on issues, implementation, and our roadmap.

What we hope for in return is your continued support and participation in collaborative innovation.

Summary

We look at this step of open sourcing the test platform as a prerequisite to unlock collaborative innovation. It is only through our combined expertise – as a community – that the test platform can be taken to the next level of success and applicability.

Here is looking forward to our collaboration. Come, let us grow the test platform, together, in the open.

Happy 15th Birthday .NET!



Today marks the 15th anniversary of .NET’s debut to the world. On February 13th, 2002, the first version of .NET was released as part of Visual Studio.NET. It seems like just yesterday that Microsoft was building its “Next Generation Windows Services” and unleashing a new level of productivity with Visual Studio.NET.

Since the beginning, the .NET platform has allowed developers to quickly build and deploy robust applications, starting with Windows desktop and web server applications in 2002. You got an entire managed framework for building distributed Windows applications, ASP.NET was introduced as the next generation of Active Server Pages for web-based development, and a new language, C# (pronounced “see sharp” :-)), came to be.

Over the years, .NET and its ecosystem have grown and expanded to meet the needs of all kinds of developers and platforms. As the technology landscape has changed, so has .NET. You can build anything with .NET, including cross-platform web apps, cloud services, mobile device apps, games, and so much more. We have a vibrant open-source community where you can participate in the direction of .NET.

With the release of Visual Studio 2017 coming on March 7th, which also marks Visual Studio’s 20th anniversary, the .NET Core tools reach 1.0. Tune in on March 7th to watch the keynote and live Q&A panels. A few nights ago we got together with the Microsoft Alumni Network and threw a big .NET birthday bash with former .NET team members & rock stars. We caught up with Anders Hejlsberg, father of the C# language, to share some stories and thoughts on .NET, open source, and the release:

Thank you to all the past and present .NET team members, rock stars, and the community of passionate developers for building incredible software that has changed our industry. Share your story about .NET and Visual Studio with the hashtags #MyVSstory and #dotnet. Show your .NET pride with a t-shirt, sticker, or mug. Or bake a cake, eat it with your development team, and share it with us on Twitter @dotnet!

Here’s to another 15 years!

– The .NET Team


TFS 2017 Update 1 RC2


Today we released the final release candidate for TFS 2017.1.

This update contains a few new features and a lot of bug fixes.  To my knowledge, we have fixed all of the bugs that were reported from RC1.  There is a small handful of bugs left to fix, and then we will be ready to ship the final version of Update 1.  We recently announced that VS 2017 and TFS 2017.1 will be released on March 7th.

Most of the new features in RC2 are small but nice.  You can read the release notes for details (look for “New in RC2”).  Probably the most frequently requested addition is the breaking up of Git repository administration permissions so that you can, for instance, give people permission to create repos without giving them permission to administer everyone else’s.  Check out the release notes, though, because there are also improvements in pull requests, testing, release management, and more.

This release candidate is a “go-live” release: it can be used in production environments and will be supported.  It will upgrade seamlessly to the final release.  It also has localized resources for all of our supported languages (RC1 was English only).  Please try it out and, if you find any issues, let us know immediately.  We still have a little time left to fix any serious problems.

Thank you,

Brian

 

 

 

How to make an offline installer for VS2017


I just got back from Kenya and South Africa and had a great time speaking at NexTech Africa and the Microsoft Tech Summit in Johannesburg. I also got to hang out with my wife's family a bunch. While I was there I was reminded (as one is when one travels) how spoiled many of us are with being always connected. Depending on how far out of town you get, the quality of internet varies. There are not just bandwidth issues but also issues of latency and reliability.

Visual Studio generally - and Visual Studio 2017 specifically - has an online installer, and if you lose connectivity during the installation you can run into problems. However, they haven't got an ISO available for download, for legal reasons. They can't package up the Android installer from Google, for example, into an ISO. The user needs to download certain things themselves dynamically.

Fortunately there are docs that walk you through making an offline installer. These could be used to create USB sticks or DVDs that could then be passed out at user groups or free events.

  • First, I went to http://visualstudio.com/free and clicked Download. I use VS Community but you can also do this for Enterprise, etc. I downloaded the bootstrapper .exe and put it in its own folder.
  • If you want EVERYTHING possible then you'd run something like this. Note that e:\vs2017offline is my folder and I selected en-US as my language.
    vs_community.exe --layout e:\vs2017offline --lang en-US
  • However if you don't want EVERYTHING - maybe you just want .NET Core, ASP.NET Core, and Azure - then you'll pass those options in on the command line. They call them "Workloads" but that's a Microsoftism.
    • Here is a list of all the Component IDs you can choose from.
    • I did this to get an offline setup for my main four "workloads." I ran this from a cmd prompt.
      vs_community.exe --layout e:\vs2017offline --lang en-US --add Microsoft.VisualStudio.Workload.Azure Microsoft.VisualStudio.Workload.ManagedDesktop Microsoft.VisualStudio.Workload.NetCoreTools Microsoft.VisualStudio.Workload.NetWeb

It will go and download everything you need. If you want everything then it'll take a while, so hang back.

Give us a minute, we'll be done soon...

If you have trouble or nothing happens, check the dd_bootstrapper*.log file in %TEMP%.

DOS prompt downloading Visual Studio

When it's all done you'll end up with a folder like this that you can copy to a DVD or USB key.

The result of the VS offline Layout generator

One nice aspect of this system is that you can update a "layout" in place. As updates become available for Visual Studio 2017 (RC or otherwise), you can run the --layout command again, pointing to the same layout folder, to ensure that the folder contains the latest components. Only those components that have been updated since the last time --layout was run will be downloaded.
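
For example, to refresh the layout created earlier, you would simply re-run the same command against the same folder:

    vs_community.exe --layout e:\vs2017offline --lang en-US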

IMPORTANT NOTE: Make sure that your file is named "vs_[SKU].exe." Sometimes you'll end up with a file like vs_community__198521760.1486960229.exe and you'll want to rename it to vs_community.exe for offline to work.

Before you run the installer, you'll want to install the root certificates in the \certificates folder. From the team:

They are the root certs needed to verify the setup application (the stuff installed under ProgramFiles\Visual Studio\2017\Installer) and the catalog (a json file that lists all of the VS components that could be installed by setup).  Most computers will already have these root certs, but users on Win7 machines may not.  Once you install these certs, setup will be able to authenticate that the content being installed is trusted.  You should not remove them after installing them.

I hope this helps you set up offline installers for your classrooms and organizations! You'll save a lot of bandwidth.


Sponsor: Big thanks to Raygun! Join 40,000+ developers who monitor their apps with Raygun. Understand the root cause of errors, crashes and performance issues in your software applications. Installs in minutes, try it today!



© 2016 Scott Hanselman. All rights reserved.
     

dotnet new angular and dotnet new react


I was exploring the "dotnet new" experience last week and how you can extend templates; then today the .NET WebDev blog posted about Steve Sanderson's work around Single Page Apps (SPA). Perfect timing!


Since I have Visual Studio 2017 RC and my .NET Core SDK tools are also RC4:

C:\Users\scott\Desktop\fancypants>dotnet --info
.NET Command Line Tools (1.0.0-rc4-004771)

Product Information:
Version: 1.0.0-rc4-004771
Commit SHA-1 hash: 4228198f0e

Runtime Environment:
OS Name: Windows
OS Version: 10.0.15031
OS Platform: Windows
RID: win10-x64
Base Path: C:\Program Files\dotnet\sdk\1.0.0-rc4-004771

I can then do this from the dotnet command line interface (CLI) and install the SPA templates:

dotnet new --install Microsoft.AspNetCore.SpaTemplates::*

The * is the package version so this is getting the latest templates from NuGet. I'm looking forward to using YOUR templates (docs are coming! These are fresh hot bits.)

This command adds new templates to dotnet new. You can see the expanded list here:

Templates                                     Short Name      Language      Tags
------------------------------------------------------------------------------------------
Console Application                           console         [C#], F#      Common/Console
Class library                                 classlib        [C#], F#      Common/Library
Unit Test Project                             mstest          [C#], F#      Test/MSTest
xUnit Test Project                            xunit           [C#], F#      Test/xUnit
Empty ASP.NET Core Web Application            web             [C#]          Web/Empty
MVC ASP.NET Core Web Application              mvc             [C#], F#      Web/MVC
MVC ASP.NET Core with Angular                 angular         [C#]          Web/MVC/SPA
MVC ASP.NET Core with Aurelia                 aurelia         [C#]          Web/MVC/SPA
MVC ASP.NET Core with Knockout.js             knockout        [C#]          Web/MVC/SPA
MVC ASP.NET Core with React.js                react           [C#]          Web/MVC/SPA
MVC ASP.NET Core with React.js and Redux      reactredux      [C#]          Web/MVC/SPA
Web API ASP.NET Core Web Application          webapi          [C#]          Web/WebAPI
Solution File                                 sln                           Solution

See there? Now I've got "dotnet new react" or "dotnet new angular" which is awesome. Now I just "npm install" and "dotnet restore" followed by a "dotnet run" and very quickly I have a great starter point for a SPA application written in ASP.NET Core 1.0 running on .NET Core 1.0. It even includes a dockerfile if I like.
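
Spelled out, the sequence looks something like this (the folder name is just an example):

    mkdir myspa
    cd myspa
    dotnet new angular
    npm install
    dotnet restore
    dotnet run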

From the template, to help you get started, they've also set up:

  • Client-side navigation. For example, click Counter then Back to return here.
  • Server-side prerendering. For faster initial loading and improved SEO, your Angular 2 app is prerendered on the server. The resulting HTML is then transferred to the browser where a client-side copy of the app takes over. THIS IS HUGE.
  • Webpack dev middleware. In development mode, there's no need to run the webpack build tool. Your client-side resources are dynamically built on demand. Updates are available as soon as you modify any file.
  • Hot module replacement. In development mode, you don't even need to reload the page after making most changes. Within seconds of saving changes to files, your Angular 2 app will be rebuilt and a new instance injected into the page.
  • Efficient production builds. In production mode, development-time features are disabled, and the webpack build tool produces minified static CSS and JavaScript files.

Go and read about these new SPA templates in depth on the WebDev blog.


Sponsor: Big thanks to Raygun! Join 40,000+ developers who monitor their apps with Raygun. Understand the root cause of errors, crashes and performance issues in your software applications. Installs in minutes, try it today!


© 2016 Scott Hanselman. All rights reserved.
     

Vcpkg recent enhancements


Vcpkg simplifies acquiring and building open source libraries on Windows. Since our first release we have continually improved the tool by fixing issues and adding features. The latest version of the tool is 0.0.71; here is a summary of the changes in this version:

  • Add support for Visual Studio 2017
    • VS2017 detection
    • Fixed bootstrap.ps1 and VS2017 support
    • If both Visual Studio 2015 and Visual Studio 2017 are installed, Visual Studio 2017 tools will be preferred over those of Visual Studio 2015
  • Improve vcpkg remove:
    • Now shows all dependencies that need to be removed instead of just the immediate dependencies
    • Add --recurse option that removes all dependencies (see the example after this list)
  • Fix vcpkg_copy_pdbs() under non-English locale
  • Notable changes for building the vcpkg tool:
    • Restructure vcpkg project hierarchy. Now only has 4 projects (down from 6). Most of the code now lives under vcpkglib.vcxproj
    • Enable multiprocessor compilation
    • Disable MinimalRebuild
    • Use precompiled headers
  • Bump required version & auto-downloaded version of cmake to 3.7.2 (was 3.5.x), which includes generators for Visual Studio 2017
  • Bump auto-downloaded version of nuget to 3.5.0 (was 3.4.3)
  • Bump auto-downloaded version of git to 2.11.0 (was 2.8.3)
  • Add 7z to vcpkg_find_acquire_program.cmake
  • Enhance vcpkg_build_cmake.cmake and vcpkg_install_cmake.cmake:
  • Introduce pre-install checks:
    • The install command now checks that files will not be overwritten when installing a package. A particular file can only be owned by a single package
  • Introduce ‘lib\manual-link’ directory. Libraries that place their lib files in that directory are not automatically added to the link line.
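
Here is a quick sketch of the improved remove behavior mentioned above; the package names are only an example (in vcpkg, libpng depends on zlib), and the exact output will differ.

    REM installing libpng also pulls in zlib as a dependency
    vcpkg install libpng

    REM removing zlib alone would break libpng, so vcpkg lists everything that has to be removed
    vcpkg remove zlib

    REM --recurse removes the package together with the packages that depend on it
    vcpkg remove zlib --recurse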

See the Change Log file for more detailed description: https://github.com/Microsoft/vcpkg/blob/master/CHANGELOG.md

As usual, your feedback and suggestions really matter. To send feedback, create an issue on GitHub, or contact us at vcpkg@microsoft.com. We have also created a survey to collect your suggestions.

The week in .NET – On .NET with Phil Haack, Readline


Previous posts:

Happy 15th Birthday .NET! Happy 20th Anniversary Visual Studio!

This week marks the 15th anniversary of .NET’s debut to the world. On February 13th, 2002, the first version of .NET was released as part of Visual Studio.NET. Read Beth Massi’s post, featuring a new interview with Anders Hejlsberg.

It’s also Visual Studio’s 20th anniversary. Julia Liuson tells the story.

On .NET

In this week’s episode, we’re speaking with Phil Haack from GitHub:

This week, we’ll take a look back on the past 15 years of .NET with Beth Massi. We’ll stream live on Channel 9. We’ll take questions on Gitter’s dotnet/home channel and on Twitter. Please use the #onnet tag. It’s OK to start sending us questions in advance if you can’t do it live during the shows.

Package of the week: Readline

Readline is one of those libraries that do one thing, and do it well. Its purpose is to implement a user prompt in your console applications, with standard keyboard shortcuts, command history, and customizable auto-complete. It is built to be a .NET implementation of GNU Readline.
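
Here is a minimal sketch of what using it could look like. The ReadLine.Read and ReadLine.AddHistory member names are assumptions based on the project’s README, so check the repository for the current API before relying on them.

using System;

class Program
{
    static void Main()
    {
        // Seed the history so the Up/Down arrows have something to cycle through.
        // NOTE: these member names are assumptions; see the ReadLine project README.
        ReadLine.AddHistory("help", "clear");

        while (true)
        {
            // Read() displays the prompt and provides Readline-style editing shortcuts.
            string input = ReadLine.Read("myapp> ");
            if (input == "exit")
                break;
            Console.WriteLine($"You typed: {input}");
        }
    }
}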

User group meeting of the week: Shared Security Responsibility in the Azure Cloud in Illinois

The Chicago Azure Cloud Users Group holds a meeting on Wednesday, February 15 at 6:00PM in Warrenville, IL on the Azure Shared Security Model.

.NET

ASP.NET

F#

Check out F# Weekly for more great content from the F# community.

Xamarin

UWP

Games

Data

And this is it for this week!

Contribute to the week in .NET

As always, this weekly post couldn’t exist without community contributions, and I’d like to thank all those who sent links and tips. The F# section is provided by Phillip Carter, the gaming section by Stacey Haffner, the Xamarin section by Dan Rigby, and the UWP section by Michael Crump.

You can participate too. Did you write a great blog post, or just read one? Do you want everyone to know about an amazing new contribution or a useful library? Did you make or play a great game built on .NET?
We’d love to hear from you, and feature your contributions on future posts:

This week’s post (and future posts) also contains news I first read on The ASP.NET Community Standup, on Weekly Xamarin, on F# weekly, and on Chris Alcock’s The Morning Brew.
