
Top stories from the VSTS community – 2018.05.18

Here are the top stories we found in our streams this week related to DevOps, VSTS, TFS and other interesting topics, listed in no specific order: TOP STORIES: How To Get Started With Performance Engineering – Scott Summers delves into Performance Engineering, after clarifying the difference between Performance Testing and Performance Engineering. Utilizing VSTS for Wholistic Organization Management... Read More

Glassmaker glassybaby streamlines retail scheduling and fosters communication with Office 365


Today’s post was written by Lee Rhodes, founder of glassybaby in Seattle, Washington.

I launched glassybaby in 1998 after being diagnosed with cancer and finding hope in the light of a handmade votive. At first, I started making votives and giving them away, but that evolved into a business. We call our votive a glassybaby because of its small round shape.

I made giving foundational to the company’s mission, and today, glassybaby donates 10 percent of all sales to a variety of charities dedicated to healing. That’s important to the people who work here. They’re passionate, mission-focused individuals who want their work lives to make a difference. They want to spend their time at work evangelizing the glassybaby mission, not struggling with administrative chores.

Our basic manufacturing process is very artisanal—handblown glass—and we’re not about to bring in machines to do that. It’s a point of pride that every one of our products is touched by human hands. But in other areas of our business, we’re trying to modernize with technology and are using Office 365 to help us do that.

Office 365 is such a broad offering; we put the apps out there and let employees take a self-service approach to finding the tools that meet their needs. We’re a fast-growing company of 400 people, so we don’t have time for lengthy product rollouts and training. We need tools that our employees intuitively understand so they can get themselves up and running.

These tools have been life-changing for us at glassybaby. We’re all in different locations on different schedules; the glassblowers work two shifts a day, the stores open at overlapping times, the corporate staff works from 8:00 AM to 6:00 PM. It’s changed our lives to have a place where we can all go and find information and communicate.

Not only do these tools help us be more efficient, but they’ve enabled more culture and community-building here at glassybaby, and that’s one of the byproducts that I’m really proud of.

I don’t think we could function without these tools; I can’t even imagine working today as we did three years ago. The Office 365 applications have really helped with the flow and the rhythm of glassybaby and alleviated a lot of stress.

—Lee Rhodes

Read the case study to learn more about how glassybaby is empowering its Firstline Workers with intuitive Office 365 scheduling.

The post Glassmaker glassybaby streamlines retail scheduling and fosters communication with Office 365 appeared first on Microsoft 365 Blog.

Console UWP Applications and File-system Access


As announced in the May 2018 issue of MSDN Magazine, the latest Windows update introduces support for Console UWP apps. We’re also introducing support for UWP apps to get broader access to the file-system than was previously possible. In fact, it was during the design and development phases of the Console app support that we realized we’d need to provide these apps with better file-system access – so the two features complement each other nicely.

Console apps represent a significant departure from the “normal” UWP model. Specifically, these apps don’t have their own windows, and don’t use the familiar input mechanisms. One of the attractions of UWP is the rich UI support, which allows you to build beautiful, highly-interactive apps. Console apps, on the other hand, use the standard console window for input and output, and the model is very much command-line driven and plain-text-based.

So, yes, Console apps are very different from traditional UWP apps, and they meet the requirements of a rather different target audience. They’re aimed squarely at developers who need to build admin or developer tools. To help you get started, we published a set of Console App (Universal) Project Templates in the Visual Studio Marketplace.

Let’s walk through creating a very simple Console UWP app, just to explore the possibilities, and to examine some of the options you have in using UWP features. The finished app is an (extremely primitive) implementation of the classic findstr tool – the app is in the Store here, and the sources are in the AppModelSamples repo on GitHub here.

To create this app from scratch in VS, having installed the project templates, select File | New Project, and then select one of the two C++ project types for Console UWP apps – either C++/CX or C++/WinRT. In this example, I’ll choose C++/WinRT.


When prompted for the target platform version information, select a version that’s 17083 or later (including Windows Insider builds).


There are two things to notice in the generated code. First, the Package.appxmanifest declares namespace aliases for desktop4 and iot2, and adds the SupportsMultipleInstances and Subsystem attributes to both the Application node and the AppExecutionAlias node:


  <Applications>
    <Application Id="App"
      Executable="$targetnametoken$.exe"
      EntryPoint="MyFindstr.App"
      desktop4:Subsystem="console" 
      desktop4:SupportsMultipleInstances="true" 
      iot2:Subsystem="console" 
      iot2:SupportsMultipleInstances="true" 
      >
...
      
      <Extensions>
          <uap5:Extension 
            Category="windows.appExecutionAlias" 
            Executable="MyFindstr.exe" 
            EntryPoint="MyFindstr.App">
            <uap5:AppExecutionAlias 
              desktop4:Subsystem="console" iot2:Subsystem="console">
              <uap5:ExecutionAlias Alias="MyFindstr.exe" />
            </uap5:AppExecutionAlias>
          </uap5:Extension>
      </Extensions>
      
    </Application>
  </Applications>

From this it should be clear that in this release Console UWP apps are only supported on Desktop and IoT devices. Currently, only Desktop and IoT devices have consoles, so the feature only makes sense there.

The second thing to notice is the main function in Program.cpp. This simply gets hold of the command-line arguments and prints them out in a loop to the console.


int main()
{
    // You can get parsed command-line arguments from the CRT globals.
    wprintf(L"Parsed command-line arguments:n");
    for (int i = 0; i < __argc; i++)
    {
        wprintf(L"__argv[%d] = %Sn", i, __argv[i]);
    }

    wprintf(L"Press Enter to continue:");
    getchar();
}

You can press F5 at this point, just to get the app built and deployed, before changing anything.


As expected, if you execute the app this way, the only command-line argument is __argv[0], which is the executable filename itself. Now that the app is deployed, you can also run it from its tile on the Start menu – and get the exact same results.

However, if you run a separate cmd (or PowerShell) window and execute the app from there, you can provide any arbitrary command-line arguments you like. This works because installing the app also registered the AppExecutionAlias declared in the manifest. By default, this is set to the same as the project name, although you can of course change this if you wish.


You can see from the standard generated code that you can use many of the traditional Win32 APIs – the latest Windows update adds a lot more console-specific APIs to the approved list, so that these can be used in Store-published apps.

But, what if you want to use WinRT APIs? Well, of course, this is a UWP app, so you’re free to use WinRT APIs as well. This does come with a caveat, however. Some WinRT APIs assume that the caller is a regular UWP app with a regular window and view; a console app doesn’t have a regular window or view, so any window-specific APIs won’t work. For the more obvious window-related APIs the point is moot: since the app has no windows of its own, you have no reason to call them.

For our simple findstr app, the first change is to add the broadFileSystemAccess restricted capability to the manifest. Add the namespace declaration at the top:


xmlns:rescap="http://schemas.microsoft.com/appx/manifest/foundation/windows10/restrictedcapabilities" 
  IgnorableNamespaces="uap mp desktop4 iot2 rescap">

…and then the capability itself at the bottom:


<Capabilities>
    <Capability Name="internetClient" />
    <rescap:Capability Name="broadFileSystemAccess" />
</Capabilities>

This corresponds to the File system setting under Privacy in the Settings app. This is where the user can enable or disable broad file-system access on either a per-app basis, or globally for all UWP apps.


Now for the meat of the app. The classic findstr tool is used to perform string searches in files. It has a lot of command-line options for configuring the search. Internally, it performs all the operations from scratch, using low-level Win32 APIs and without using any higher-level libraries. However, for the purposes of this demo, I’m going to cheat and use the std::regex APIs. One obvious reason for this is because I don’t want to write thousands of lines of code for this sample. A less obvious reason is that I want to explore how to use libraries of different types in my Console UWP app (more on this in a later post). My cut-down version of findstr will take in only two simple command-line arguments:

  • A pattern to search for.
  • A fully-qualified folder path, in which to search.

First, some housekeeping. I’ll add some #includes in my pch.h. I plan on using the Windows.Storage APIs as well as std::regex:


#include "winrt/Windows.Storage.h"
#include <regex>

At the top of my Program.cpp, I’ll add corresponding namespace declarations:


using namespace winrt;
using namespace winrt::Windows::Storage;
using namespace Windows::Foundation::Collections;

Now I can update main to get the search pattern and the starting folder supplied on the command-line. If for some reason I can’t open that folder using the Storage APIs, I’ll just bail with some suitable error code. This could be because the user has disabled File-system access in Settings, or because the folder path supplied was non-existent or badly-formed, and so on. Also note that I’m not doing any input-validation here – but of course in a real app you would be very careful to check all inputs before proceeding, and most especially if your app is expecting to have broad file-system access.


int main()
{
    hstring searchPattern = to_hstring(__argv[1]);
    hstring folderPath = to_hstring(__argv[2]);
    StorageFolder folder = nullptr;
    try
    {
        folder = StorageFolder::GetFolderFromPathAsync(folderPath.c_str()).get();
    }
    catch (...)
    {
        wprintf(L"Error: cannot access folder '%S'n", __argv[2]);
        return 2;
    }

If I did manage to get the folder open, I can then continue to use the Storage APIs to get all the files in this folder for searching, and then to call a recursive GetDirectories function, so that I can walk the folder tree from this point downwards. I’ll write the GetDirectories and SearchFile functions shortly.


    if (folder != nullptr)
    {
        wprintf(L"nSearching folder '%s' and below for pattern '%s'n", folder.Path().c_str(), searchPattern.c_str());
        try
        {
            IVectorView<StorageFolder> folders = folder.GetFoldersAsync().get();

            // Recurse sub-directories.
            GetDirectories(folders, searchPattern);

            // Get the files in this folder.
            IVectorView<StorageFile> files = folder.GetFilesAsync().get();
            for (uint32_t i = 0; i < files.Size(); i++)
            {
                StorageFile file = files.GetAt(i);
                SearchFile(file, searchPattern);
            }
        }
        catch (const std::exception& ex)
        {
            wprintf(L"Error: %S\n", ex.what());
            return 3;
        }
        catch (...)
        {
            wprintf(L"Error: unknown\n");
            return 3;
        }
    }

    return 0;
}

Now for the recursive GetDirectories function. Given the collection of folders in the current directory, I can recursively process all sub-directories. At each level, I can also get all the files for searching.


void GetDirectories(IVectorView<StorageFolder> folders, hstring searchPattern)
{
    try
    {
        for (uint32_t i = 0; i < folders.Size(); i++)
        {
            StorageFolder folder = folders.GetAt(i);

            // Recurse this folder to get sub-folder info.
            IVectorView<StorageFolder> subDir = folder.GetFoldersAsync().get();
            if (subDir.Size() != 0)
            {
                GetDirectories(subDir, searchPattern);
            }

            // Get the files in this folder.
            IVectorView<StorageFile> files = folder.GetFilesAsync().get();
            for (uint32_t j = 0; j < files.Size(); j++)
            {
                StorageFile file = files.GetAt(j);
                SearchFile(file, searchPattern);
            }
        }
    }
    catch (const std::exception& ex)
    {
        wprintf(L"Error: %S\n", ex.what());
    }
    catch (...)
    {
        wprintf(L"Error: unknown\n");
    }
}

Now, the SearchFile function. This is where I’m cheating and using std::regex instead of doing all the work myself. First, I’ll read all the text out of the file with ReadTextAsync. If the file isn’t a text file, this will throw an exception – for simplicity, I’m just reporting the error and moving on. Then, I enhance the raw search pattern to search for any whitespace-delimited “word” that contains the pattern. Finally, I use regex_search to do the pattern-matching, and print out all the found words and their position in the text:


void SearchFile(StorageFile file, hstring searchPattern)
{
    if (file != nullptr)
    {
        try
        {
            wprintf(L"nScanning file '%s'n", file.Path().c_str());
            hstring text = FileIO::ReadTextAsync(file).get();
            std::string sourceText = to_string(text);
            std::smatch match;
            std::string compositePattern =
                "(\\S+\\s+){0}\\S*" + to_string(searchPattern) + "\\S*(\\s+\\S+){0}";
            std::regex expression(compositePattern);

            while (std::regex_search(sourceText, match, expression))
            {
                wprintf(L"%8d %Sn", match.position(), match[0].str().c_str());
                sourceText = match.suffix().str();
            }
        }
        catch (const std::exception& ex)
        {
            wprintf(L"Error: %S\n", ex.what());
        }
        catch (...)
        {
            wprintf(L"Error: cannot read text from file.\n");
        }
    }
}
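
To see the pattern-matching in isolation, here’s a minimal standalone sketch of the same idea in plain C++ (no WinRT). The sample text and search word are mine, and the offset bookkeeping is a small addition so that printed positions are absolute rather than relative to the remaining suffix:

#include <cstdio>
#include <regex>
#include <string>

int main()
{
    // Same composite pattern as SearchFile: match any whitespace-delimited
    // "word" that contains the raw search pattern.
    std::string searchPattern = "cat";
    std::string compositePattern =
        "(\\S+\\s+){0}\\S*" + searchPattern + "\\S*(\\s+\\S+){0}";
    std::regex expression(compositePattern);

    std::string sourceText = "concatenate the catalog, then scatter the cats";
    std::smatch match;
    std::ptrdiff_t offset = 0;
    while (std::regex_search(sourceText, match, expression))
    {
        // Print the absolute position and the matched "word".
        std::printf("%8td %s\n", offset + match.position(0), match[0].str().c_str());
        offset += match.position(0) + match.length(0);
        sourceText = match.suffix().str();
    }
}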

One additional convenience: because this is a UWP app, it will have an entry in the Start menu app-list, and the user could pin its tile to Start. So, it could be executed from Start – in which case, of course, it won’t have any command-line arguments. Since I’m assuming the command-line includes at least a search pattern and folder path, I’ll allow for this with a simple check. So, I’ll add this right at the beginning of main:


    if (__argc < 3)
    {
        ShowUsage();
        return 1;
    }

…and define the matching ShowUsage function:


void ShowUsage()
{
    wprintf(L"Error: insufficient arguments.n");
    wprintf(L"Usage:n");
    wprintf(L"MyFindstr <search-pattern> <fully-qualified-folder-path>.n");
    wprintf(L"Example:n");
    wprintf(L"MyFindstr on D:\Temp.n");

    wprintf(L"nPress Enter to continue:");
    getchar();
}

That’s it! Having re-deployed this version with F5, I can then execute it from a cmd or PowerShell window, supplying a search pattern and a starting folder on the command-line.


Now admittedly, my findstr app has only the most basic functionality, but hopefully it does illustrate how easy it is to create a simple Console UWP app that consumes both standard C++ library APIs and WinRT APIs.

For anyone who has already started experimenting with Console UWP apps, it’s worth calling out that the original version of the VS templates had a couple of bugs (or, arguably, a bug and an anomaly), which were fixed in version 1.4. The issues do not prevent the app from building and executing, but they do prevent F5 debugging, correct packaging, and Store publication. If you have already created projects using the old templates, you can apply the fixes by making the same changes in the .vcxproj and package.appxmanifest files manually (for both C++/CX and C++/WinRT).

First, the bug. In the package.appxmanifest file, v1.4 added iot2 to the ignorable XML namespaces:


IgnorableNamespaces="uap mp uap5 iot2 desktop4">

Of course, if you’re not targeting IoT devices, then you can instead delete this, and also delete the IoT namespace declaration itself:


  xmlns:iot2="http://schemas.microsoft.com/appx/manifest/iot/windows10/2"

…as well as the iot2-scoped Subsystem declarations in both the Application and AppExecutionAlias nodes:


iot2:Subsystem="console"

Now for the anomaly: v1.3 of the template included declarations in the .vcxproj file that are specific to .NET Native. These are required if your app includes, say, a C# WinRT component, where .NET Native is used. However, if your app doesn’t include such a component, then the declarations will prevent correct packaging. In v1.4 of the template these have been removed – both the target framework declarations at the top of the file:


    <TargetFrameworkIdentifier>.NETCore</TargetFrameworkIdentifier>
    <TargetFrameworkVersion>v5.0</TargetFrameworkVersion>

…and also multiple instances of the following:


    <PlatformToolset>v141</PlatformToolset>
    <UseDotNetNativeToolchain>true</UseDotNetNativeToolchain>

Please also note that all C++/WinRT templates are still experimental at this stage, and subject to change.

Apologies for any inconvenience caused by the bugs – and please do let us know if you have any feedback on the templates, or on the Console UWP feature.

For documentation, see Create a Universal Windows Platform Console app and File access permissions.

The post Console UWP Applications and File-system Access appeared first on Windows Developer Blog.

A new product badge for Microsoft Store applications



As many of you know, building quality apps is quite a commitment and investment in time. Once your app is in the Store, the next challenge is getting the word out about your new title and driving traffic to your product. Today, we’d like to announce a new tool for marketing your apps in your own blogs and websites. We’d like to introduce our new web badge for Microsoft Store products.


The new badge renders in your own website, pulling the localized logo, pricing (including sale pricing!), ratings, and artwork directly from the Store catalog. To render this badge for 8 Zip, simply embed this script using its Store Id (9wzdncrfhwb8). Please note you must add the Id in two places in the badge script: in the “class=” attribute and as the first argument to the “mspb(…)” call.


<div id="mspb-nc9jl2ngc1i" class="9wzdncrfhwb8"></div>
<script src="https://storebadge.azureedge.net/src/badge-1.6.1.js"></script>
<script>
mspb('9wzdncrfhwb8', function(badge) {
document.getElementById('mspb-nc9jl2ngc1i').innerHTML = badge;
});
</script>

Clicking the button on the badge directs your customers to the proper Product Description Page, where they make the actual purchase. You can add multiple badges to any single page; just make sure each uses a unique div id, as sketched below.
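
For instance, a hedged sketch of a page hosting two badges (the second Store Id and both div ids below are placeholders, not real values):

<div id="mspb-badge-one" class="9wzdncrfhwb8"></div>
<div id="mspb-badge-two" class="YOUR-SECOND-STORE-ID"></div>
<script src="https://storebadge.azureedge.net/src/badge-1.6.1.js"></script>
<script>
// Each mspb() call pairs a Store Id with the unique div that will host its badge.
mspb('9wzdncrfhwb8', function(badge) {
  document.getElementById('mspb-badge-one').innerHTML = badge;
});
mspb('YOUR-SECOND-STORE-ID', function(badge) {
  document.getElementById('mspb-badge-two').innerHTML = badge;
});
</script>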

To see the badge in action, check out Xbox’s @majornelson (www.majornelson.com), who is using the badge to promote Xbox content on his blog.

Example post here: https://majornelson.com/2018/05/03/xbox-live-gold-members-play-for-honor-xcom-2-and-just-cause-3-for-free-this-weekend/.

That’s it! Feel free to promote your apps and games on your own sites.

The post A new product badge for Microsoft Store applications appeared first on Windows Developer Blog.

Exploring Azure App Service – Web Apps and SQL Azure


There is a good chance that your web app uses a database. In my previous post introducing Azure App Service, I showed some of the benefits of hosting apps in Azure App Service, and how easy it is to get a basic site running in a few clicks. In this post I’ll show how to set up a SQL Azure database along with an App Service Web App from Visual Studio, and apply Entity Framework migrations automatically as part of publish.

Let’s get going

To get started, you’ll first need:

  • Visual Studio 2017 with the ASP.NET and web development workload installed (download now)
  • An Azure account.
  • Any ASP.NET or ASP.NET Core app that uses a SQL Database. For the purposes of this post, I’ll create a new ASP.NET Core app with Individual Authentication:
    • On the “New ASP.NET Core Web Application” dialog, click the “Change Authentication” button.
    • Then select the “Individual User Accounts” radio button and click “OK”.
    • Click OK.

I can now run my project locally (F5) and create user accounts, which will be stored in a SQL Server Express LocalDB database on my machine.

Publishing to App Service with a Database

Let’s publish our application to Azure. To do this, I’ll right-click my project in Solution Explorer and choose “Publish”.


This brings up the Visual Studio publish target dialog, which will default to the Azure App Service pane with the “Create new” radio button selected. To continue click “Publish”.

This brings up the “Create App Service” dialog (see the “Key App Service Concepts” section of my previous post for an explanation of the fields). To create a SQL Database for our app to use, click the “Create a SQL Database” link in the top right section of the dialog.


This will bring up the “Configure SQL Database” dialog.

  • Note: If you are using a Visual Studio Enterprise subscription, many regions will not let you create a SQL Azure database, so I recommend choosing “East US” or “West US 2” depending on where you are located (we are adding logic in the Visual Studio 2017 15.8 update to remove those regions in that case, but for now you’ll need to choose an appropriate region). To do this, click the “New…” button next to the “Hosting Plan” dropdown and pick the appropriate region (“East US” or “West US 2”).
  • Since I don’t have an existing SQL Server, the first thing I need to do is create a server to host the database, so I’ll click the “New…” button next to the “SQL Server” dropdown.
  • Choose a location for the database.
  • Provide an administrator user name and password for the server.
  • Click “OK”.
  • Make sure the connection string name field matches the name of the connection string your application uses to access the database (if using a new project, it is “DefaultConnection”, which will be prepopulated for you; a sketch of where this name lives in your app’s configuration follows this list).
  • Click OK
  • Then click the “Create” button on the “Create App Service” dialog
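
For reference, the connection string name corresponds to an entry in your app’s configuration. In a new ASP.NET Core project it looks roughly like this in appsettings.json (the server and database names here are placeholders):

{
  "ConnectionStrings": {
    "DefaultConnection": "Server=(localdb)\\MSSQLLocalDB;Database=aspnet-MyApp;Trusted_Connection=True;MultipleActiveResultSets=true"
  }
}

When you publish, Visual Studio points this name at the new SQL Azure database via an App Service connection-string setting, so the code itself doesn’t need to change.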

It should take ~2-3 minutes to create all of the resources in Azure; then your application will publish and a browser will open to your home page.

Configuring EF Migrations

At this point there is a database for your app to use in the cloud, but EF migrations have not been applied, so any functionality that relies on the database (e.g., registering for a user account) will result in an error.

To apply EF migrations to the database:

  • Click the “Configure…” button on the publish summary page.
  • Navigate to the “Settings” tab.
  • When it finishes discovering data contexts, expand the “Entity Framework Migrations” section, and check “Apply this migration on publish” for all of the contexts it finds.
  • Click “Save”.
  • Click Publish again; in the output window you should see “Generating Entity framework SQL Scripts” and then “Generating Entity framework SQL Scripts completed successfully”.

That’s it, your web app and SQL Azure database are both configured and running in the cloud.

Conclusion

Hopefully, this post showed you how easy it is to try App Service and SQL Azure. We believe that for most people, App Service is the easiest place to get started with cloud development, even if you need to move to other services in the future for further capabilities (compare hosting options). As always, let us know if you run into any issues, or have any questions below or via Twitter.

Because it’s Friday: Laurel or Yanny


I can only assume you've heard about this already: it's gone wildly viral in the USA, and I assume elsewhere in the world. But it's a lovely example of an auditory illusion, and as regular readers know I like to collect such things (like this one and this one).

What do you hear?! Yanny or Laurel pic.twitter.com/jvHhCbMc8I

— Cloe Feldman (@CloeCouture) May 15, 2018

Kottke has a useful roundup of the best scientific coverage of why some people hear "laurel" and others hear "yanny". Linguist Rachel Gutman also provides a detailed analysis for The Atlantic. For the record, though, all I can hear is "laurel", and that indeed is what the actor said for a recording for Vocabulary.com.

That's all from us for this week. Next week I'll be at the rOpenSci unconference in Seattle, but we'll be back with more on the blog on Monday. In the meantime have a great weekend!

Installing PowerShell Core on a Raspberry Pi (powered by .NET Core)


Earlier this week I set up .NET Core and Docker on a Raspberry Pi and found that I could run my podcast website quite easily on a Pi. Check that post out as there's a lot going on. I can test within a Linux Container and output the test results to the host and then open them in VS. I also explored a reasonably complex Dockerfile that is both multiarch and multistage. I can reliably build and test my website either inside a container or on the bare metal of Windows or Linux. Very fun.

As primarily a Windows developer, I have lots of batch/cmd files like "test.bat" or "dockerbuild.bat." They start as little throwaway bits of automation, but as the project grows they inevitably become more complex.

I'm not interested in "selling" anyone PowerShell. If you like bash, use bash, it's lovely, as are shell scripts. PowerShell is object-oriented in its pipeline, moving lists of real objects as standard output. They are different and most importantly, they can live together. Just like you might call Python scripts from bash, you can call PowerShell scripts from bash, or vice versa. Another tool in our toolkits.

PS /home/pi> Get-Process | Where-Object WorkingSet -gt 10MB


NPM(K)    PM(M)      WS(M)     CPU(s)     Id  SI ProcessName
------    -----      -----     ------     --  -- -----------
     0     0.00      10.92     890.87    917 917 docker-containe
     0     0.00      35.64   1,140.29    449 449 dockerd
     0     0.00      10.36       0.88   1272 037 light-locker
     0     0.00      20.46     608.04   1245 037 lxpanel
     0     0.00      69.06      32.30   3777 749 pwsh
     0     0.00      31.60     107.74    647 647 Xorg
     0     0.00      10.60       0.77   1279 037 zenity
     0     0.00      10.52       0.77   1280 037 zenity

Bash and shell scripts are SUPER powerful. It's a whole world. But they are text-based (or JSON for some newer things), so you're often thinking about text more.

pi@raspberrypidotnet:~ $ ps aux | sort -rn -k 5,6 | head -n6

root 449 0.5 3.8 956240 36500 ? Ssl May17 19:00 /usr/bin/dockerd -H fd://
root 917 0.4 1.1 910492 11180 ? Ssl May17 14:51 docker-containerd --config /var/run/docker/containerd/containerd.toml
root 647 0.0 3.4 155608 32360 tty7 Ssl+ May17 1:47 /usr/lib/xorg/Xorg :0 -seat seat0 -auth /var/run/lightdm/root/:0 -nolisten tcp vt7 -novtswitch
pi 1245 0.2 2.2 153132 20952 ? Sl May17 10:08 lxpanel --profile LXDE-pi
pi 1272 0.0 1.1 145928 10612 ? Sl May17 0:00 light-locker
pi 1279 0.0 1.1 145020 10856 ? Sl May17 0:00 zenity --warning --no-wrap --text

You can take it as far as you like. For some, it's intuitive power; for others, it's baroque.

pi@raspberrypidotnet:~ $ ps -eo size,pid,user,command --sort -size | awk '{ hr=$1/1024 ; printf("%13.2f Mb ",hr) } { for ( x=4 ; x<=NF ; x++ ) { printf("%s ",$x) } print "" }'

0.00 Mb COMMAND
161.14 Mb /usr/bin/dockerd -H fd://
124.20 Mb docker-containerd --config /var/run/docker/containerd/containerd.toml
78.23 Mb lxpanel --profile LXDE-pi
66.31 Mb /usr/lib/xorg/Xorg :0 -seat seat0 -auth /var/run/lightdm/root/:0 -nolisten tcp vt7 -novtswitch
61.66 Mb light-locker

Point is, there's choice. Here's a nice article about PowerShell from the perspective of a Linux user. Can I install PowerShell on my Raspberry Pi (or any Linux machine) and use the same scripts in both places? YES.

For many years PowerShell was a Windows-only thing that was part of the closed Windows ecosystem. In fact, here's video of me nearly 12 years ago (I was working in banking) talking to Jeffrey Snover about PowerShell. Today, PowerShell is open source up at https://github.com/PowerShell with lots of docs and scripts, also open source. PowerShell is supported on Windows, Mac, and a half-dozen Linuxes. Sound familiar? That's because it's powered (ahem) by open source cross platform .NET Core. You can get PowerShell Core 6.0 here on any platform.

Don't want to install it? Start it up in Docker in seconds with

docker run -it microsoft/powershell

Sweet. How about Raspbian on my ARMv7-based Raspberry Pi? I was running Raspbian Jessie, and PowerShell is supported on Raspbian Stretch (newer), so I upgraded from Jessie to Stretch (and tidied up and updated the firmware while I was at it) with:

$ sudo apt-get update

$ sudo apt-get upgrade
$ sudo apt-get dist-upgrade
$ sudo sed -i 's/jessie/stretch/g' /etc/apt/sources.list
$ sudo sed -i 's/jessie/stretch/g' /etc/apt/sources.list.d/raspi.list
$ sudo apt-get update && sudo apt-get upgrade -y
$ sudo apt-get dist-upgrade -y
$ sudo rpi-update

Cool. Now I'm on Raspbian Stretch on my Raspberry Pi 3. Let's install PowerShell! These are just the most basic Getting Started instructions. Check out GitHub for advanced and detailed info if you have issues with prerequisites or paths.

NOTE: Here I'm getting PowerShell Core 6.0.2. Be sure to check the releases page for newer releases if you're reading this in the future. I've also used 6.1.0 (in preview) with success. The next 6.1 preview will upgrade to .NET Core 2.1. If you're just evaluating, get the latest preview as it'll have the most recent bug fixes.

$ sudo apt-get install libunwind8

$ wget https://github.com/PowerShell/PowerShell/releases/download/v6.0.2/powershell-6.0.2-linux-arm32.tar.gz
$ mkdir ~/powershell
$ tar -xvf ./powershell-6.0.2-linux-arm32.tar.gz -C ~/powershell
$ sudo ln -s ~/powershell/pwsh /usr/bin/pwsh
$ sudo ln -s ~/powershell/pwsh /usr/local/bin/powershell
$ powershell

Lovely.

GOTCHA: Because I upgraded from Jessie to Stretch, I ran into a bug where libssl1.0.0 is getting loaded over libssl1.0.2. This is a complex native issue with interaction between PowerShell and .NET Core 2.0 that's being fixed. Only upgraded machines like mine will hit it, and it's easily fixed with sudo apt-get remove libssl1.0.0

Now this means my PowerShell build scripts can work on both Windows and Linux. This is a deeply trivial example (just one line) but note the "shebang" at the top that lets Linux know what a *.ps1 file is for. That means I can keep using bash/zsh/fish on Raspbian, but still "build.ps1" or "test.ps1" on any platform.

#!/usr/local/bin/powershell

dotnet watch --project ./hanselminutes.core.tests test /p:CollectCoverage=true /p:CoverletOutputFormat=lcov /p:CoverletOutput=./lcov
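
PowerShell Core also exposes automatic variables such as $IsLinux, $IsMacOS, and $IsWindows, which come in handy when a shared script needs one platform-specific step. A tiny sketch (the branch bodies are just illustrative):

#!/usr/local/bin/powershell
# One script shared across platforms; branch only where it matters.
if ($IsLinux) {
    Write-Host "Running on Linux (perhaps a Raspberry Pi)"
}
elseif ($IsWindows) {
    Write-Host "Running on Windows"
}
dotnet --info   # the same tooling commands work on both platforms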

Here's a few totally random but lovely PowerShell examples:

PS /home/pi> Get-Date | Select-Object -Property * | ConvertTo-Json

{
  "DisplayHint": 2,
  "DateTime": "Sunday, May 20, 2018 5:55:35 AM",
  "Date": "2018-05-20T00:00:00+00:00",
  "Day": 20,
  "DayOfWeek": 0,
  "DayOfYear": 140,
  "Hour": 5,
  "Kind": 2,
  "Millisecond": 502,
  "Minute": 55,
  "Month": 5,
  "Second": 35,
  "Ticks": 636623925355021162,
  "TimeOfDay": {
    "Ticks": 213355021162,
    "Days": 0,
    "Hours": 5,
    "Milliseconds": 502,
    "Minutes": 55,
    "Seconds": 35,
    "TotalDays": 0.24693868190046295,
    "TotalHours": 5.9265283656111105,
    "TotalMilliseconds": 21335502.1162,
    "TotalMinutes": 355.59170193666665,
    "TotalSeconds": 21335.502116199998
  },
  "Year": 2018
}

You can take PowerShell objects to and from Objects, Hashtables, JSON, etc.

PS /home/pi> $hash = @{ Number = 1; Shape = "Square"; Color = "Blue"}
PS /home/pi> $hash

Name   Value
----   -----
Shape  Square
Color  Blue
Number 1


PS /home/pi> $hash | ConvertTo-Json
{
  "Shape": "Square",
  "Color": "Blue",
  "Number": 1
}

Here's a nice one from MCPMag:

PS /home/pi> $URI = "https://query.yahooapis.com/v1/public/yql?q=select * from weather.forecast where woeid in (select woeid from geo.places(1) where text='{0}, {1}')&format=json&env=store://datatables.org/alltableswithkeys" -f 'Omaha','NE'

PS /home/pi> $Data = Invoke-RestMethod -Uri $URI
PS /home/pi> $Data.query.results.channel.item.forecast|Format-Table

code date day high low text
---- ---- --- ---- --- ----
39 20 May 2018 Sun 62 56 Scattered Showers
30 21 May 2018 Mon 78 53 Partly Cloudy
30 22 May 2018 Tue 88 61 Partly Cloudy
4 23 May 2018 Wed 89 67 Thunderstorms
4 24 May 2018 Thu 91 68 Thunderstorms
4 25 May 2018 Fri 92 69 Thunderstorms
34 26 May 2018 Sat 89 68 Mostly Sunny
34 27 May 2018 Sun 85 65 Mostly Sunny
30 28 May 2018 Mon 85 63 Partly Cloudy
47 29 May 2018 Tue 82 63 Scattered Thunderstorms

Or a one-liner if you want to be obnoxious.

PS /home/pi> (Invoke-RestMethod -Uri "https://query.yahooapis.com/v1/public/yql?q=select * from weather.forecast where woeid in (select woeid from geo.places(1) where text='Omaha, NE')&format=json&env=store://datatables.org/alltableswithkeys").query.results.channel.item.forecast|Format-Table

Example: This won't work on Linux as it's using Windows-specific APIs, but if you've got PowerShell on your Windows machine, try out this one-liner for a cool demo:

iex (New-Object Net.WebClient).DownloadString("http://bit.ly/e0Mw9w")

Thoughts?


Sponsor: Check out JetBrains Rider: a cross-platform .NET IDE. Edit, refactor, test and debug ASP.NET, .NET Framework, .NET Core, Xamarin or Unity applications. Learn more and download a 30-day trial!



© 2018 Scott Hanselman. All rights reserved.
     

Azure IoT Reference Architecture update


Last week, we released an update to the Azure IoT Reference Architecture Guide. Our focus for the update was to bring the document forward to the latest Azure IoT cloud native recommended architecture and latest technology implementation recommendations. The updated guide includes an overview of the IoT space, recommended subsystem factoring for solutions, and prescriptive technology recommendations per subsystem. Technical content added includes coverage of topics such as microservices, containers, orchestrators (e.g. Kubernetes and Service Fabric), serverless usage, Azure Stream Analytics, and Edge devices. Major updates were made to the Stream Processing and Storage subsystem sections of the document covering rules processing and storage technology options on Azure across differing types of IoT solutions.

The IoT Architecture Guide aims to accelerate customers building IoT Solutions on Azure by providing a proven production ready architecture, with proven technology implementation choices, and with links to Solution Accelerator reference architecture implementations such as Remote Monitoring and Connected Factory. The document offers an overview of the IoT space, recommended subsystem factoring for scalable IoT solutions, prescriptive technology recommendations per subsystems, and detailed sections per subsystem that explore use cases and technology alternatives.

Future updates – please provide feedback and ask questions

We will be making consistent updates over the coming months and would appreciate your feedback with building production solutions with Azure. Please use our User Voice channel to provide architecture and technology suggestions. We encourage you to browse what others are suggesting, vote for your favorites, and enter suggestions of your own.


Changes coming to PowerShell (preview) in Azure Cloud Shell


Azure Cloud Shell provides browser-based authenticated shell access to Azure from virtually anywhere. Cloud Shell gives the users a rich environment with common tools that is updated and maintained by Microsoft.

Currently, Azure Cloud Shell provides two environments that can be launched from Azure Portal, dedicated URL, Azure documentation, Visual Studio Code via the Azure Account extension, and Azure App:

  • Bash in Cloud Shell that runs Bash shell on Ubuntu Linux, which was made generally available in November 2017

  • PowerShell in Cloud Shell that runs Windows PowerShell 5.1 on Windows Server Core and has been in preview since September 2017

In this post, we are listing the key upcoming changes to the PowerShell experience in Azure Cloud Shell, namely:

  • Faster startup time
  • PowerShell Core 6 as the default experience
  • Running on a Linux container
  • Persistent Tool Settings

Faster Startup Time

We are well aware that the startup time of PowerShell in Azure Cloud Shell does not meet users’ expectations. For the past couple of months, the team has been working hard to make significant improvements in this area. We expect to deliver multi-fold improvements in the startup time for the PowerShell experience (and also make the Bash experience faster).

Default to PowerShell Core 6

In January 2018, PowerShell Core 6 reached general availability (GA). With the ecosystem of PowerShell Core 6 growing, it's the perfect opportunity to make PowerShell Core 6 the default PowerShell experience in Cloud Shell. To support easy management of Azure resources, all of the Azure PowerShell modules are on a path to be supported on PowerShell Core 6, and are currently in preview.

Consistent Tool Availability

To ensure the best command-line tools experience while using Azure Cloud Shell, the PowerShell experience will be switching to a Linux container running PowerShell Core 6. This change will enable a consistent toolset experience across the PowerShell and Bash experiences in Cloud Shell.

Persistent Tool Settings

In addition to saving your modules and scripts to Cloud Drive, persistent settings for available tools, such as Git and SSH, will be automatically saved to your Cloud Drive. This removes the additional set-up these tools currently require.

FAQs

What are the changes in my PowerShell experience?

With the move to PowerShell Core 6 on a Linux container, there are a few key changes to the user experience:

  • By default, the PowerShell experience is case-insensitive, while Linux file-system operations are case-sensitive (you could have files named file, File, and FILE in the same folder). After these changes, users will have to be mindful of casing when performing file-system operations.

  • Some common aliases on Windows, such as ls and sleep, that map to built-in UNIX commands will no longer work as before; a quick illustration follows this list.
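
A quick sketch of both differences (the file names are arbitrary):

PS /home/user> New-Item file.txt, FILE.txt   # both succeed: the Linux file system is case-sensitive
PS /home/user> Get-Alias ls                  # errors: on Linux, ls is the native /bin/ls, not an alias
PS /home/user> Get-ChildItem *.txt           # use the cmdlet when you want objects back in the pipeline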

The modules I use are not supported in PowerShell Core 6, what do I do?

The PowerShell team is currently working with the module owners to port them to .NET Core. While some modules are already complete or in the works (like AzureRM), there is more to be done. We are prioritizing modules based on their usage. If there is a module that you would like to see in Cloud Shell that has not been ported, provide feedback directly to the module owners.

What is happening to Cloud Shell based on Windows?

In Azure Cloud Shell, there will no longer be a Windows-based experience. Since most operations from Cloud Shell happen against service endpoints and there is little to no need to manage the underlying OS, swapping it out should have minimal impact. Cloud Drive will continue to be available across both experiences and serves as the best place to save your work.

What is the difference between the Bash and the PowerShell experiences?

There is no difference in tooling between the two experiences: now that both run in the same OS, the tool set will be the same. From Bash, you will be able to start the PowerShell experience by running pwsh, and from the PowerShell experience, you will be able to start the Bash experience by running bash.

How can I provide feedback?

You can provide feedback via the Cloud Shell UserVoice, or email pscloudshell@microsoft.com.

Azure.Source – Volume 32


Azure the cloud for all – highlights from Microsoft BUILD 2018 - In this final recap of Microsoft Build 2018, Julia White, Corporate Vice President, Microsoft Azure pulled together some key highlights and top sessions to watch. This post summarizes what's new across tools, containers+serverless, IoT, and Data+AI.

"Hey! You! Get on my cloud." Corey Sanders at Build 2018.

“Hey! You! Get on my cloud.” Corey Sanders at Build 2018.

Now in preview

Public preview: Query across applications in log alerts - You can use Azure Application Insights to monitor a distributed modern cloud application. In the same spirit, log alerts enable you to combine data across various apps. Cross-app query support in log alerts is currently in preview.

Now generally available

Protect virtual machines across different subscriptions with Azure Security Center - Azure Security Center’s Cross-Subscription Workspace Selection enables you to collect and monitor data in one location from virtual machines that run in different workspaces, subscriptions, and run queries across them.

Announcing SQL Advanced Threat Protection (ATP) and SQL Vulnerability Assessment general availability - SQL Vulnerability Assessment (VA) provides you a one-stop-shop to discover, track and remediate potential database vulnerabilities. It helps give you visibility into your security state, and includes actionable steps to investigate, manage and resolve security issues, and enhance your database fortifications. VA is available for Azure SQL Database customers as well as for on-premises SQL Server customers via SSMS.

Also generally available

Tuesdays with Corey

We're talkin' Azure Low Priority VMs - Corey Sanders, Corporate VP - Microsoft Azure Compute team sat down with Meagan McCrory, Senior PM on the Azure Compute Team to talk about the availability of Low Priority VMs.

News and updates

Enhance productivity using Azure Data Factory Visual Tools - Azure Data Factory (ADF) visual tools enable a rich, interactive visual authoring and monitoring experience to iteratively create, configure, test, deploy and monitor data integration pipelines without any friction. The main goal of the ADF visual tools is to allow you to be productive with ADF by getting pipelines up and running quickly, without requiring you to write a single line of code. Read this post for the latest updates.


Azure Data Factory Visual Tools

Accelerate innovation with Consulting Services in Azure Marketplace - As announced at Build 2018, Azure customers can now easily maximize the potential of the intelligent cloud through newly released Azure Marketplace Consulting Services offerings. The new offer provides assessments, briefings, implementations, proof-of-concepts, and workshops by Microsoft partners with a Silver or Gold cloud competency.

Azure Marketplace new offers: April 16-30 - The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. In the second half of April we published 15 new offers, including: Kubernetes Sandbox Certified by Bitnami and Forcepoint Next Generation Firewall.

Additional news and updates

Technical content and training

Using the Azure IoT Python SDK: make sure you check the version! - Azure IoT Python SDK is a wrapper on top of our Azure IoT C SDK, and we release binary packages on pip for Windows, Ubuntu, and Raspbian, all of which are compatible with Python 2 and Python 3. This post covers the ups and downs of this approach, and provides developers with guidance on how to use it.

Help improve our Azure docs! - This post introduces a new experiment in Azure docs, code-named Aladdin: an AI assistant that connects you to relevant Azure documentation and helps you accomplish your work more efficiently. The Aladdin extension is available for Chrome and coming soon for Edge.


Aladdin extension in Azure docs

10 great things about Kafka on HDInsight - Dhruv Goel, Program Manager for Azure Big Data, provides his list of the 10 great things about Kafka on HDInsight (Microsoft Azure’s managed Kafka cluster offering), including its ease of use and scalability.

Why developers should enable Azure Security Center’s Just-in-Time VM Access - Learn how Azure Security Center’s Just-in-Time VM Access can help you secure virtual machines that are running your applications and code. Just-in-Time VM Access is a part of Security Center’s Standard Tier, which is free for the first 60 days.

Detect malicious activity using Azure Security Center and Azure Log Analytics - Investigating malicious activity on their systems can be tedious and knowing where to start is challenging. Azure Security Center makes it simple for you to respond to detected threats. Learn how Security Center’s ability to detect threats using machine learning and Azure Log Analytics can help you keep pace with rapidly evolving cyberattacks.

Azure tips and tricks

Deploy an Azure Web App using only the CLI

Working with Files in Azure App Service
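
As a taste of what the first video covers, standing up a web app from the CLI takes only a few commands (the resource group, plan, and app names below are placeholders):

az group create --name MyResourceGroup --location westus2
az appservice plan create --name MyPlan --resource-group MyResourceGroup --sku FREE
az webapp create --name my-unique-app-name --resource-group MyResourceGroup --plan MyPlan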

Customers and partners

Accelerate your cloud data warehouse with automation tools - Data warehouse automation (DWA) tools bring the benefits of meta-data driven automation, and code generation to streamline developing and managing a data warehouse solution. Learn which data warehouse automation partners have certified their tools with SQL DW, and read what customers have to say about data warehouse automation tools.

New Azure Network Watcher integrations and Network Security Group Flow Logging updates - Azure Network Watcher provides you the ability to monitor, diagnose, and gain insights into your network in Azure. This post highlights two of the most recent partners who offer tools that seamlessly integrate and understand your network in Azure: McAfee and NSG.

The Azure Podcast

The Azure Podcast: Episode 229 - Live from Build with Jessica Deen - On Day 2 of Build 2018, the crew chats with Jessica Deen, Cloud Developer Advocate, about VSTS and Kubernetes.

Events

Join Microsoft at Bio-IT World - Last week, Microsoft sponsored the Bio-IT World Conference and Expo in Boston. Learn about the research and product development Microsoft is doing in the life sciences and healthcare industries.

Internet of Things Show

The IoT Show | IoT Projects by the Azure CAT E2E team - There is a team within Azure working closely with customers on special projects, integrating the latest Azure technologies. The IoT Show got the chance to visit their lair just before Build to get a sneak peek at some of the demos they were preparing for the event. Drones, Azure IoT, machine learning on the edge, and lots of fun!

The IoT Show | A tour of the IoT booth at Build 2018 - The IoT Show was live at the IoT booth at Build 2018. You can re-live the tour on demand and check out all the cool demos the various IoT teams put together.

The IoT Show | Azure IoT Central and Devices - Marcello Majonchi joined Olivier on the IoT Show to discuss some of his cool projects using Azure IoT Central and a set of development boards such as the ElectricImp, the MXChip DevKit, and the Raspberry Pi.

Developer spotlight

Build 2018 learning path: IoT apps in the cloud or on the edge - Connecting devices to the cloud can generate new business opportunities. Predictive maintenance can save costs and improve customer satisfaction. Tracking shipments and optimizing routes can improve efficiency. Remotely managing healthcare equipment can provide better care and save costs. The possibilities with IoT are endless. Microsoft provides the most comprehensive IoT portfolio that enables you to build the solutions of tomorrow using your existing skills. Choose from SaaS and PaaS offerings and extend your IoT application to the intelligent edge and to Microsoft proprietary hardware.

Build 2018 learning path: Migrate existing apps to the cloud - Microsoft Azure can bring you incredible value. From enhanced security to reduced hands-on server maintenance to increased application availability and reduced costs. To benefit from this value, migrate your existing infrastructure, platform, services, and applications to the cloud. Microsoft provides tools, options, and guidance to help you migrate and modernize your applications over time, in a way that suits your business needs.

Build 2018 learning path: Extend existing apps with cloud services - Whether you have desktop, web or mobile applications, Microsoft can help you to enhance them by providing intelligent services and features that can make your application smarter, more reliable, scalable and more cost-efficient. Features like cognitive services add easy-to-consume intelligence, Application Insights provide advanced application monitoring, and App Service makes your app scalable and deployable with no downtime. Use these and other services by just bolting them on.

AI Toolkit for Azure IoT Edge - This toolkit shows you how to package deep learning models in Azure IoT Edge-compatible Docker containers and expose those models as REST APIs. We've included examples to help get you started, but the possibilities are endless. We'll be adding new examples and tools often. The models can be used as-is or customized to better meet your specific needs and use cases.

Kubernetes virtual kubelet provider for managing Azure IoT Edge deployments - Azure IoT Edge Connector leverages the Virtual Kubelet project to provide a virtual Kubernetes node backed by an Azure IoT hub. It translates a Kubernetes pod specification to an IoT Edge Deployment and submits it to the backing IoT hub. The edge deployment contains a device selector query that controls which subset of edge devices the deployment will be applied to.

Tutorial: Explore the capabilities of the remote monitoring solution accelerator - This tutorial shows you the key capabilities of the remote monitoring solution. To introduce these capabilities, the tutorial showcases common customer scenarios using a simulated IoT application for a company called Contoso. The tutorial helps you understand the typical IoT scenarios the remote monitoring solution provides out-of-the-box.

AI Show

AI Show | Conversational AI and Authentication - Connecting your bot to resources like the Microsoft Graph API, LinkedIn, or Uber can be challenging. In this video, learn about the newly built-in authentication cards supported in Azure Bot Service, enabling your bot to authenticate users against a wide variety of auth providers and perform tasks on their behalf.

AI Show | Conversational AI: Bot Building Tools - Watch to learn about the latest bot building tools that are AI-services aware and support an end-to-end bot development workflow to create, build, test, deploy and manage your bot. The tools we provide include a set of rich command-line tools that help you create and manage bots and channel registrations, and manage connected AI services. We will showcase how different tools assist you in different stages of the bot development process, helping you with conversational modelling, LUIS, QnA Maker and language model dispatching. We will also give a sneak peek of the new and extensible bot emulator, which allows management of bots, connected services and transcripts. All the tools and the emulator are open source – visit our GitHub repos and clone, contribute, comment and be a part of creating great bot tools!

Video: speeding up R with parallel programming in the cloud


I had a great time in Budapest last week for the eRum 2018 conference. The organizers have already made all of the videos available online. Here's my presentation: Speeding up R with Parallel Programming in the cloud.

You can find (and download) my presentation slides here. And if you just want the references from the last slide, here are the links:

Announcing the deprecation of the WIT and Test Client OM at Jan 1, 2020

Since the first version of Team Foundation Server (TFS) in 2005, we have provided a set of SOAP APIs for programmatic interaction with Work Items and Tests. In recent years, REST has replaced SOAP as the preferred method for building integrations, offering a simpler and more flexible programming model, support for multiple data formats, and... Read More

A Penny Saved is a Ton of Serverless Compute Earned


Scott Guthrie recently shared one of my favorite anecdotes on his Azure Red Shirt Tour. A Microsoft customer regularly invokes 1 billion (yes, that’s with a “B”) Azure Functions per day. The customer reached out to support after the first month thinking there was a bug in the billing system, only to find out that the $72 was in fact correct. How is that possible? Azure Functions is a serverless compute platform that allows you to focus on code that only executes when triggered by events, and you only pay for CPU time and memory used during execution (versus a traditional web server where you are paying a fee even if your app is idle). This is called micro-billing, and is one key reason serverless computing is so powerful.

Curious about Azure Functions? Follow the link https://aka.ms/go-funcs to get up and running with your first function in minutes.


Scott Guthrie on the Azure Red Shirt Tour

In fact, micro-billing is so important that it’s one of three rules I use to verify whether a service is serverless. There is no official set of rules, and there is no standard for serverless. The closest thing to a standard is the whitepaper published by the Cloud Native Computing Foundation titled CNCF WG-Serverless Whitepaper v1.0 (PDF). The paper describes serverless computing as “building and running applications that do not require server management,” and goes on to state that such applications are “executed, scaled, and billed in response to the exact demand needed at the moment.”

It’s easy to label almost everything serverless, but there is a difference between managed and serverless. A managed service takes care of responsibilities for you, such as standing up a website or hosting a Docker container. Serverless is a managed service but requires a bit more. Here are Jeremy’s Serverless Rules:

  1. The service should be capable of running entirely in the cloud. Running locally is fine and often preferred for developing, testing, and debugging, but ultimately it should end up in the cloud.
  2. You don’t have to configure a virtual machine or cluster. Docker is great, but containers require a Docker host to run. That host typically means setting up a VM and, for resiliency and scale, using an orchestrator like Kubernetes to scale the solution. There are also services like Azure Web Apps that provide a fully managed experience for running web apps and containers, but I don’t consider them serverless because they break the next rule.
  3. You only pay for active invocations and never for idle time. This rule is important, and the essence of micro-billing. Azure Container Instances (ACI) is a great way to run a container, but I pay for it even when it’s not being used. A function, on the other hand, only bills when it’s called.

These rules are why I stopped calling managed databases “serverless.” So, what, then, does qualify as serverless?

The Azure serverless platform includes Azure Functions, Logic Apps, and Event Grid. In this post, we’ll take a closer look at Azure Functions.

Azure Functions

Azure Functions allows you to write code that is executed based on an event, or trigger. Triggers may include an HTTP request, a timer, a message in a queue, or any number of other important events. The code is passed details of the trigger but can also access bindings that make it easier to connect to resources like databases and storage. The serverless Azure Functions pricing model is based on two parameters: invocations and gigabyte seconds.
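To make the model concrete, here is a minimal sketch of an HTTP-triggered function written in TypeScript against the Node.js programming model (types from the @azure/functions package). The trigger and binding configuration would live in an accompanying function.json file, and all names here are illustrative:

```typescript
import { AzureFunction, Context, HttpRequest } from "@azure/functions";

// Runs only when an HTTP request arrives -- there is no server to manage,
// and no charge accrues while the function sits idle.
const httpTrigger: AzureFunction = async function (
  context: Context,
  req: HttpRequest
): Promise<void> {
  const name = req.query.name || (req.body && req.body.name);
  context.res = {
    status: 200,
    body: name ? `Hello, ${name}` : "Pass a 'name' on the query string or body",
  };
};

export default httpTrigger;
```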

Invocations are the number of times the function is invoked based on its trigger. Gigabyte seconds are a function of memory usage over time. Imagine a graph that shows time on the x-axis and memory consumption on the y-axis, and plot the memory usage of your function over time. Gigabyte seconds represent the area under the curve.

Let’s assume you have a microservice that is called every minute and takes one second to scan and aggregate data. It uses a steady 128 megabytes of memory during the run. Using the Azure Pricing Calculator, you’ll find that the estimated cost is zero. That’s because the first 400,000 gigabyte seconds and 1 million invocations are free every month. Running every second (there are 2,628,000 seconds in a month) with double the memory (256 megabytes), the entire monthly cost is estimated at $4.51.
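The arithmetic behind that estimate is easy to reproduce. The sketch below uses the consumption-plan rates and free grants as published at the time of writing; treat them as assumptions to verify against the pricing calculator, which also applies minimum billing granularity and therefore lands slightly higher:

```typescript
// Back-of-the-envelope consumption-plan estimate for the scenario above:
// one 1-second, 256 MB execution every second of the month.
const PRICE_PER_GB_SECOND = 0.000016; // USD, assumed published rate
const PRICE_PER_MILLION_EXECUTIONS = 0.2; // USD, assumed published rate
const FREE_GB_SECONDS = 400_000;
const FREE_EXECUTIONS = 1_000_000;

const executions = 2_628_000; // one per second for a month
const gbSeconds = executions * 1 /* sec each */ * (256 / 1024); /* GB */ // 657,000

const computeCost =
  Math.max(0, gbSeconds - FREE_GB_SECONDS) * PRICE_PER_GB_SECOND;
const executionCost =
  (Math.max(0, executions - FREE_EXECUTIONS) / 1_000_000) *
  PRICE_PER_MILLION_EXECUTIONS;

// Roughly $4.44; the calculator's $4.51 also accounts for billing granularity.
console.log(`$${(computeCost + executionCost).toFixed(2)} per month`);
```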

[Image: Pricing calculator for Azure Functions]

Recently I tweeted about my own experience with serverless cost (or lack thereof). I wrote a link-shortening tool. It uses a function to take long URLs and turn them into a shorter code I can easily share. I also have a function that takes the short code, performs the redirect, then stores the data in a queue. Another microservice processes items in the queue and stores metadata that I can analyze later. I have tens of thousands of invocations per month and my total cost is less than a dollar.
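For illustration, the redirect half of such a tool might look roughly like the sketch below: an HTTP-triggered function that resolves the short code, queues an analytics event through an output binding, and returns a 302. The binding name, route parameter, and lookup helper are all hypothetical and would be wired up in function.json:

```typescript
import { AzureFunction, Context, HttpRequest } from "@azure/functions";

// Hypothetical shape of the analytics event queued for later processing.
interface RedirectEvent {
  shortCode: string;
  timestamp: string;
}

const redirect: AzureFunction = async function (
  context: Context,
  req: HttpRequest
): Promise<void> {
  const shortCode = context.bindingData.code as string; // route parameter
  const targetUrl = await lookupLongUrl(shortCode); // assumed helper

  // Queue output binding (declared in function.json as "analyticsQueue");
  // a separate function drains the queue and aggregates the metadata.
  const event: RedirectEvent = {
    shortCode,
    timestamp: new Date().toISOString(),
  };
  context.bindings.analyticsQueue = event;

  context.res = { status: 302, headers: { location: targetUrl } };
};

async function lookupLongUrl(code: string): Promise<string> {
  // Placeholder: a real implementation would consult table storage or a DB.
  return `https://example.com/${code}`;
}

export default redirect;
```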

[Image: A tweet about the cost of running serverless code in Azure]

Do I have your attention?

In future posts I will explore the cost model for Logic Apps and Event Grid. In the meantime…

Learn about and get started with your first Azure Function by following this link: https://aka.ms/go-funcs

.NET Framework May 2018 Preview of Quality Rollup for Windows 10 1709 (Fall Creators Update)


Today, we are releasing the May 2018 Preview of Quality Rollup for Windows 10 1709 (Fall Creators Update).

Quality and Reliability

This release contains the following quality and reliability improvements.

CLR

  • Resolves an issue with deserialization when using a collection that ignores casing, for example, a ConcurrentDictionary created with a case-insensitive comparer. [524135]
  • Resolves instances of high CPU usage with background garbage collection. This can be observed with the following two functions on the stack: clr!*gc_heap::bgc_thread_function, ntoskrnl!KiPageFault. Most of the CPU time is spent in the ntoskrnl!ExpWaitForSpinLockExclusiveAndAcquire function. This change updates background garbage collection to use the CLR implementation of write watch instead of the one in Windows. [574027]

Networking

  • Fixed a problem with the connection limit when using HttpClient to send requests to loopback addresses. [539851]

WPF

  • A crash can occur during shutdown of an application that hosts WPF content in a separate AppDomain. (A notable example of this is an Office application hosting a VSTO add-in that uses WPF.) [543980]
  • Addresses an issue that caused XAML Browser Applications (XBAPs) targeting .NET 3.5 to sometimes be loaded incorrectly using the .NET 4.x runtime. [555344]
  • A WPF application can crash due to a NullReferenceException if a Binding (or MultiBinding) used in a DataTrigger (or MultiDataTrigger) belonging to a Style (or Template, or ThemeStyle) reports a new value, but whose host element gets GC’d in a very narrow window of time during the reporting process. [562000]
  • A WPF application can crash due to a spurious ElementNotAvailableException. This can arise if you:
    1. Change TreeView.IsEnabled
    2. Remove an item X from the collection
    3. Re-insert the same item X back into the collection
    4. Remove one of X’s subitems Y from its collection
    (Step 4 can happen any time relative to steps 2 and 3, as long as it’s after step 1. Steps 2–4 must occur before the asynchronous call to UpdatePeer posted by step 1; this will happen if steps 1–4 all occur in the same button-click handler.) [555225]

Note: Additional information on these improvements is not available. The VSTS bug number provided with each improvement is a unique ID that you can give Microsoft Customer Support, include in Stack Overflow comments, or use in web searches.

Getting the Update

The Preview of Quality Rollup is available via Windows Update, Windows Server Update Services, and Microsoft Update Catalog.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog.

Product Version: Windows 10 1709 (Fall Creators Update)
Preview of Quality Rollup KB: 4103714 (available via the Microsoft Update Catalog)

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:

New updates for Microsoft Azure Storage Explorer


Following the recent general availability of Storage Explorer, we have added new features in the latest 1.1 release to align with the Azure Storage platform:

  • Azurite cross-platform emulator
  • Access tiers that efficiently consume resources based on how frequently a blob is accessed
  • The removal of SAS URL start time to avoid datacenter synchronization issues

Storage Explorer is a great tool for managing the contents of your Azure storage account. You can upload, download, and manage blobs, files, queues, and Cosmos DB entities. Additionally, you can manage your Virtual Machine disks, work with either Azure Resource Manager or classic storage accounts, and manage and configure cross-origin resource sharing (CORS) rules. Storage Explorer works with public Azure, sovereign Azure clouds, and Azure Stack.

Let’s go through some example scenarios where Storage Explorer helps with your daily work.

Sign-in to your Azure Cloud from Storage Explorer

To get started using Storage Explorer, sign in to your Azure account and stay connected to your subscriptions. If you have an account for Azure, an Azure sovereign cloud, or Azure Stack, you can easily sign in from the Storage Explorer Add an Account dialog.

In addition, Storage Explorer now shares the same sign-in component with Visual Studio. This means you only need to sign in once from Visual Studio or Storage Explorer, and your developer tools will stay connected to your Azure subscriptions.


Upload and download blobs

Blob storage is a convenient way to store various types of files. You might have many photos related to an event that you want saved in blob storage so your website can access them. Or you might have a very large VHD file you want to back up to blob storage. Storage Explorer ensures you can upload and download blobs efficiently while maintaining data integrity. We will continually improve uploading and downloading performance to provide you an even better experience.
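The same uploads and downloads can also be scripted against the Storage SDK. Here is a minimal sketch using the @azure/storage-blob package for Node.js; the connection string, container, and file names are placeholders:

```typescript
import { BlobServiceClient } from "@azure/storage-blob";

async function main(): Promise<void> {
  // Placeholder connection string, read from the environment.
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING!
  );
  const container = service.getContainerClient("event-photos");
  await container.createIfNotExists();

  const blob = container.getBlockBlobClient("opening-night.jpg");
  await blob.uploadFile("./photos/opening-night.jpg"); // upload from disk
  await blob.downloadToFile("./backup/opening-night.jpg"); // and back down
}

main().catch(console.error);
```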


Efficiently consume storage resource with access tiers

Sometimes you have data that’s frequently accessed by your web application, such as the media resources of your website front page. Other times, you might have short-term backup and disaster recovery datasets, which don’t need to be accessed very frequently. Wouldn’t it be nice if you paid for storage based on how you actually access it?

In the latest Storage Explorer, you can leverage the Storage platform capability to configure access tiers for a Storage Account – hot, cool, or archive. Learn more about Storage access tiers at Azure Blob storage: Hot, cool, and archive storage tiers.
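Programmatically, changing a blob’s tier is a one-line call in the Storage SDK. A small sketch, again with placeholder names:

```typescript
import { BlobServiceClient } from "@azure/storage-blob";

// Moves an infrequently used backup blob to the cool tier.
async function coolDown(): Promise<void> {
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING!
  );
  const blob = service
    .getContainerClient("backups")
    .getBlockBlobClient("2018-05-restore-point.vhd");
  await blob.setAccessTier("Cool"); // "Hot", "Cool", or "Archive"
}

coolDown().catch(console.error);
```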


Easily obtain SAS URL for sharing and access control

Have you ever been in the situation where you need to share a large file with someone else, but the file is too large to be attached to an email? With Storage Explorer, you can quickly upload the file to blob storage, then obtain a temporary access link to the file and send the link to your colleagues or friends. You can configure how long the link remains valid and the access control policies applied to it. Easily and securely share content with Storage Explorer!
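Under the hood this is a shared access signature (SAS). If you prefer to script it, the sketch below generates a read-only link that expires after 24 hours using @azure/storage-blob; the account name, key, and blob coordinates are placeholders:

```typescript
import {
  BlobSASPermissions,
  generateBlobSASQueryParameters,
  StorageSharedKeyCredential,
} from "@azure/storage-blob";

// Placeholder account credentials.
const credential = new StorageSharedKeyCredential("myaccount", "<account-key>");

// Read-only token, valid for the next 24 hours.
const sas = generateBlobSASQueryParameters(
  {
    containerName: "shared-files",
    blobName: "big-video.mp4",
    permissions: BlobSASPermissions.parse("r"),
    expiresOn: new Date(Date.now() + 24 * 60 * 60 * 1000),
  },
  credential
).toString();

const url = `https://myaccount.blob.core.windows.net/shared-files/big-video.mp4?${sas}`;
console.log(url); // share this link instead of attaching the file
```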


Capture snapshots for blobs and file share

Before modifying a file, you might want to make a backup copy of the existing one for future reference. Storage Explorer enables you to back up blobs and file shares as they appear at a moment in time. You can create a snapshot for each individual blob; for file shares, the snapshot is taken at the entire share level. Check out these features today!
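Snapshots are equally scriptable. A minimal sketch that captures a point-in-time copy of a blob before it is overwritten (names are placeholders):

```typescript
import { BlobServiceClient } from "@azure/storage-blob";

async function snapshotBeforeEdit(): Promise<void> {
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING!
  );
  const blob = service
    .getContainerClient("documents")
    .getBlockBlobClient("quarterly-report.docx");

  // Capture the blob exactly as it is right now.
  const { snapshot } = await blob.createSnapshot();
  console.log(`Snapshot id: ${snapshot}`); // opaque timestamp identifier
  // ...now it is safe to overwrite the base blob.
}

snapshotBeforeEdit().catch(console.error);
```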


Accessibility support

We believe technology should empower every individual to work with efficiency and high productivity. As part of this release, we are proud to deliver features including better keyboard navigation, such as quickly jumping between panels, improved screen reader support, such as adding aria-live tags to activities, and tons of little fixes to our high contrast themes. We’ll actively look for your feedback on GitHub.

Open feedback platform

Instead of filling out a survey to offer a suggestion, you can now open Storage Explorer issues on GitHub. You can search existing issues, add comments to issues you feel are most important, share workarounds with other users, and receive updates when issues are resolved.

Next steps


Blue-Green deployments using Azure Traffic Manager


Azure Traffic Manager, Azure’s DNS-based load balancing solution, is used by customers for a wide variety of use cases, including routing a global user base to endpoints on Azure that will give them the fastest, lowest-latency experience, providing seamless auto-failover for mission-critical workloads, and migrating from on-premises to the cloud. One key use case where customers leverage Traffic Manager is to make their software deployments smoother, with minimal impact to their users, by implementing a Blue-Green deployment process using Traffic Manager’s weighted round-robin routing method. This blog will show how we can implement Blue-Green deployment using Traffic Manager, but before we dive deep, let us discuss what we mean by Blue-Green deployment.

Blue-Green deployment is a software rollout method that can reduce the impact of interruptions caused by issues in the new version being deployed. This is achieved by exposing the new version of the software to a limited set of users and expanding that user base gradually until everyone is using the new version. If at any time the new version is causing issues, for example a broken authentication workflow in the new version of a web application, all the users can be instantly* redirected to the old version.

This is achieved by running two matching virtual environments known as Blue and Green. Normally just one environment (Blue) serves all user traffic and the other environment (Green) is either absent or is idle. During a production deployment to the Green environment, Traffic Manager can be used to gradually send more and more users from Blue to Green while continuously testing the Green environment with live traffic.

For your workloads that are running in Azure, the recommendation is to set up the Blue environment, which has the old code, and the Green environment, which has the new code, in separate Azure Resource Manager resource groups. If the endpoint is external, you can use any continuous integration and deployment tool to manage and deploy the two environments. Once you have the environments ready, you can create a Traffic Manager profile using the Azure portal, PowerShell, or CLI, with weighted round-robin as the routing method, and add the endpoints corresponding to these environments.

In this example we set the first endpoint, Blue.contoso.com, with a weight of 1,000 and the second endpoint, Green.contoso.com, with a weight of 1. This ensures that effectively all traffic goes to the Blue environment and essentially none is routed to the Green environment. This is the initial state of the Blue-Green deployment.
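As a sketch of what this initial state might look like in code, here is one way to create the profile and the two weighted endpoints with the Azure SDK for JavaScript (@azure/arm-trafficmanager). Resource names are placeholders, and property shapes may differ slightly between SDK versions:

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { TrafficManagerManagementClient } from "@azure/arm-trafficmanager";

const client = new TrafficManagerManagementClient(
  new DefaultAzureCredential(),
  "<subscription-id>"
);

async function setUpBlueGreen(): Promise<void> {
  // Weighted profile; its DNS name becomes contoso-bluegreen.trafficmanager.net.
  await client.profiles.createOrUpdate("contoso-rg", "contoso-bluegreen", {
    location: "global",
    trafficRoutingMethod: "Weighted",
    dnsConfig: { relativeName: "contoso-bluegreen", ttl: 30 },
    monitorConfig: { protocol: "HTTPS", port: 443, path: "/health" },
  });

  // Blue carries effectively all traffic; Green stays warm but nearly idle.
  for (const [name, target, weight] of [
    ["blue", "Blue.contoso.com", 1000],
    ["green", "Green.contoso.com", 1],
  ] as const) {
    await client.endpoints.createOrUpdate(
      "contoso-rg",
      "contoso-bluegreen",
      "ExternalEndpoints",
      name,
      {
        type: "Microsoft.Network/trafficManagerProfiles/externalEndpoints",
        target,
        weight,
        endpointStatus: "Enabled",
      }
    );
  }
}

setUpBlueGreen().catch(console.error);
```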

[Image: All the traffic is sent to the Blue environment]

At this point you should set up your client application NOT to connect directly to the endpoints, but to go through Traffic Manager by using the DNS name of the profile you created (this DNS name will end with trafficmanager.net).

Pro tip: You can use Azure DNS to host your custom domain name and then point that to the Traffic Manager DNS name. This way your client applications and users can use an easy to remember name to access your service through Traffic Manager. You can find more information on how to do this in the Azure DNS documentation page.

Once you have updated the Green environment with the latest version you want to roll out, you can steadily increase the weight of the Green endpoint so that a limited number of users are directed there. You can then monitor the experience of those users, using Azure Monitor or Azure Network Watcher, to decide whether more users can be exposed to the Green environment or whether you need to stop sending users to it.
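Each step of the ramp-up is then just a matter of rewriting the endpoint weights. A sketch of one such step, reusing the same SDK and placeholder names as above:

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { TrafficManagerManagementClient } from "@azure/arm-trafficmanager";

const client = new TrafficManagerManagementClient(
  new DefaultAzureCredential(),
  "<subscription-id>"
);

// One step of the rollout: rewrite the weights so that roughly
// greenWeight / 1000 of DNS resolutions point at the Green environment.
async function setGreenWeight(greenWeight: number): Promise<void> {
  const endpoints = [
    // Weights must be at least 1; disable Blue outright once Green takes all.
    { name: "blue", target: "Blue.contoso.com", weight: Math.max(1, 1000 - greenWeight) },
    { name: "green", target: "Green.contoso.com", weight: greenWeight },
  ];
  for (const e of endpoints) {
    await client.endpoints.createOrUpdate(
      "contoso-rg",
      "contoso-bluegreen",
      "ExternalEndpoints",
      e.name,
      {
        type: "Microsoft.Network/trafficManagerProfiles/externalEndpoints",
        target: e.target,
        weight: e.weight,
        endpointStatus: "Enabled",
      }
    );
  }
}

// Ramp gradually, e.g. 1 -> 50 -> 250 -> 500 -> 1000, monitoring user
// experience between each step.
setGreenWeight(50).catch(console.error);
```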

[Image: A portion of traffic is sent to the Green environment]

This process of measuring impact and changing the settings is a stepwise process that can be automated with a continuous deployment pipeline (e.g., using Jenkins, Terraform, and Azure Resource Manager templates). If the deployment is a web application running on Azure Web Apps, you can simply use Visual Studio to execute most of the Blue-Green deployment.

You can keep adjusting the weight given to the Green environment up or down until your software deployment is complete and all traffic is directed to the Green environment running the new version of your software.

[Image: All the traffic is sent to the Green environment]

Azure Traffic Manager makes this process of deploying safely with the Blue-Green methodology seamless, granular, and fast-acting*. Above all, you are in control when it comes to deciding how fast or how slow you wish to ramp up the Green environment.

 

*The DNS response TTL value you have set in your profile affects how soon you can increase or decrease traffic to an endpoint. For more details, please visit the Azure Traffic Manager FAQ.

AI, Machine Learning and Data Science Roundup: May 2018


A monthly roundup of news about Artificial Intelligence, Machine Learning and Data Science. This is an eclectic collection of interesting blog posts, software announcements and data applications I've noted over the past month or so.

Open Source AI, ML & Data Science News

R 3.5.0 has been released, with significant performance improvements.

Project Jupyter has received the ACM Software System Award.

PyTorch 1.0 has been announced, with the goal of being ready for both research and production AI applications.

The ecosystem for ONNX (the open standard for exchange of neural network models) expands, with official support for Core ML, NVIDIA TensorRT 4, and the Snapdragon Neural Processing Engine.

A new Swift API for TensorFlow has been released, with compiler and language enhancements.

TensorFlow Probability, a probabilistic programming toolbox for machine learning, has been released.

Industry News

Impressive benchmarks for training ImageNet and CIFAR-10 using AWS spot instances, with PyTorch and fastai.

Oracle acquires machine learning platform Datascience.com, a cloud workspace platform for data science projects and workloads.

Microsoft News

Microsoft Build featured several announcements regarding AI and machine learning. You can watch the keynotes here.

Many new capabilities and features for Cognitive Services were announced at Build, including: Bing Custom Search, Custom Vision Service, and Custom Decision Service; Microsoft’s Cognitive Services Labs (with previews of emerging Cognitive Services technologies); and Video Indexer.

Cognitive Search, a new Azure service providing an AI-first approach to content understanding.

New Python packages for machine learning in Azure: Forecasting, Computer Vision, and Text Analytics.

ML.NET is a new open-source, cross-platform machine learning framework for developers. The technology behind AI features in Office and Windows has now been released as a project on GitHub.

Project Brainwave is now in preview, bringing hardware-accelerated inference for AI to Azure. Deep neural networks running on FPGAs (field-programmable gate arrays) can dramatically reduce the time to classify images (as just one application example). ResNet50 is available in Project Brainwave now (subject to quota approval).

Microsoft and Qualcomm have partnered on the Vision AI Developer Kit, now in preview.

Updates to the Bot Builder SDK and Bot Framework Emulator.

Microsoft R Open 3.4.4 is now available.

Learning resources

Transfer Learning for Text using the Deep Learning Virtual Machine (DLVM). Code comparing the performance of eight machine-comprehension algorithms is also available on GitHub.

The TWIML AI podcast series on differential privacy, a technique for collecting data in such a way that it reduces the impact on privacy even in the event of a leak.

Building Neural Networks with TensorFlow and the Azure Data Science Virtual Machine. This hands-on lab walks through the process of building an image recognizer using transfer learning with MobilenetV1.

A fully-integrated experience simplifying Language Understanding in conversational AI systems.

Deep Learning Image Segmentation for Ecommerce Catalogue Visual Search, with details on removing the background from a product image using GrabCut and Tiramisu.

How to Develop a Currency Detection Model using Azure Machine Learning, with details on how the real-time banknote recognition capability of the Seeing AI application was implemented in CoreML.

Deep Learning for Emojis with VS Code Tools for AI: semantic analysis of text with emojis.

A maze-solving Minecraft robot, implemented in R.

Find previous editions of the monthly AI roundup here.

Get started with web push notifications


Microsoft Edge now supports web push notifications via the Push API, beginning with the Windows 10 April 2018 Update. Web push notifications allow you to build re-engaging experiences by delivering timely, relevant notifications to users, even when the browser or web app is closed.

To help you get started with push messages and to demonstrate how they work across different browsers, we’re happy to introduce a new Microsoft Edge tutorial: Web Push Notifications.

Logo reading: "Web Push Notifications: Welcome to the future of the web - where push messages can help you achieve better engagement for your site or web app."

Push notifications expand your reach to your users in a timely, power-efficient and dependable way. Users are re-engaged with customized and relevant content that will have them coming back for more.

Our tutorial is inspired by astrology, drawing parallels between the idea of messages from the zodiac being stored in the cloud and the way the architecture of push notifications is structured. Take a look to learn to set up your site for push notifications – both the front-end and the back-end.
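To give a flavor of the front-end half, here is a minimal TypeScript sketch that registers a service worker and subscribes to push. The service-worker path, VAPID public key, and subscription endpoint are placeholders that your own back-end would supply; the tutorial walks through the real values:

```typescript
// Register a service worker and subscribe this browser to push messages.
async function subscribeToPush(): Promise<PushSubscription> {
  const registration = await navigator.serviceWorker.register("/sw.js");
  const subscription = await registration.pushManager.subscribe({
    userVisibleOnly: true, // every push must surface a visible notification
    applicationServerKey: urlBase64ToUint8Array("<VAPID-public-key>"),
  });
  // Send the subscription to your server so it can target this browser later.
  await fetch("/api/subscribe", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify(subscription),
  });
  return subscription;
}

// Standard helper: VAPID keys are URL-safe base64, but the API wants bytes.
function urlBase64ToUint8Array(base64String: string): Uint8Array {
  const padding = "=".repeat((4 - (base64String.length % 4)) % 4);
  const base64 = (base64String + padding).replace(/-/g, "+").replace(/_/g, "/");
  const raw = atob(base64);
  return Uint8Array.from([...raw].map((c) => c.charCodeAt(0)));
}
```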

Illustration of the zodiac, with a PC screen superimposed showing a push notification of a zodiac reading.

As with all our demos, you can fork the tutorial itself on GitHub and even set it up locally for your own experimentation. To learn more about building with Push and Service Worker, check out Ali Alabbas’ session from Microsoft Build 2018: Building performant and re-engaging web apps with Service Worker.

To try out push notifications in Microsoft Edge, visit the tutorial, click on the button that says “Initiate push”, accept the permission prompt, and immediately close the tab (or browser). You should get a notification within 5 seconds of clicking the button. When you click the notification, it will take you right back to the web push tutorial, so you can continue learning about push messages.

Try it out and let us know what you think!

— Ali Alabbas, Program Manager, Microsoft Edge
Stephanie Drescher, Program Manager, Microsoft Edge

The post Get started with web push notifications appeared first on Microsoft Edge Dev Blog.

Accelerate data warehouse modernization with Informatica Intelligent Cloud Services for Azure


Today at Informatica World, Scott Guthrie, EVP, Cloud + AI, along with Anil Chakravarthy, CEO of Informatica, announced the availability of Informatica Intelligent Cloud Services (IICS) for Azure. Microsoft has partnered with Informatica, a leader in Enterprise Data Management, to help our customers accelerate data warehouse modernization. This service is available as a free preview on Azure today.

Informatica provides a discovery-driven approach to data warehouse migration. This approach simplifies the process of identifying and moving data into Azure SQL Data Warehouse (SQL DW), Microsoft’s petabyte-scale, fully managed, globally available analytics platform. With the recently released SQL DW Compute Optimized Gen2 tier, you can enjoy 5x the performance, 4x the concurrency, and 5x the scale of the previous generation.


With this release, Informatica Intelligent Cloud Services for Azure can be launched directly from the Azure Portal. You can enjoy a single sign-on experience and don't have to create a separate Informatica account. With Informatica Data Accelerator for Azure, you can discover and load data into SQL DW. Informatica’s discovery-driven approach allows you to work with thousands of tables and columns.

“We are very excited about this next step in our long-standing partnership with Microsoft", said Pratik Parekh, VP, Product Management, Informatica. “This native integration of our industry leading iPaaS, Informatica Intelligent Cloud Services, on the Azure Portal demonstrates our joint engineering commitment. With this integration, customers can accelerate the migration of legacy data warehouse workloads to Azure SQL Data Warehouse.”​

You can further enrich your data with Informatica’s Enterprise Data Catalog, which provides enterprise-wide operationalization and governance capabilities. You can perform impact analysis, discover data relationships, track lineage, and schedule loading, among other capabilities.

Informatica Intelligent Cloud Services together with Azure simplifies the experience of identifying and migrating your on-premises data to a modern data warehouse. With this partnership, Microsoft and Informatica accelerate your journey to the cloud, enabling you to leverage the scalability and flexibility offered by SQL DW.

Get started


Announcing ASP.NET Providers Connected Service Visual Studio Extension


The provider pattern was introduced in ASP.NET 2.0, and it gives developers flexibility in where to store the state of ASP.NET features (e.g., Session State, Membership, Output Cache). In ASP.NET 4.6.2, we added async support for the Session State Provider and Output Cache Provider. These providers offer much better scalability and enable web applications to adapt to cloud environments. Furthermore, we also released SqlSessionStateProviderAsync, CosmosDBSessionStateProviderAsync, RedisSessionStateProvider, and SQLAsyncOutputCacheProvider. Through these providers, web applications can store Session State in Azure resources such as SQL Azure, Cosmos DB, and Redis Cache, and Output Cache in SQL Azure. With so many options, it may not be straightforward to pick one and configure it correctly in your application. Today we are releasing the ASP.NET Providers Connected Service Visual Studio extension to help you pick the right provider and configure it properly to work with Azure resources. This extension is a one-stop shop for installing and configuring all the ASP.NET providers that are Azure-ready.

How to install the extension

The ASP.NET Providers Connected Service extension can be installed on Visual Studio 2017. You can install it through Extensions and Updates in Visual Studio by typing “ASP.NET Providers Connected Service” in the search box, or you can download the extension from the Visual Studio Marketplace.

How to use the extension

To use the extension, make sure that your web application targets .NET Framework 4.6.2 or higher. You can open the extension by right-clicking on the project, selecting Add, and clicking on Connected Service. You will see all the Connected Services installed in your copy of Visual Studio that apply to your project.

After clicking on the Microsoft ASP.NET Providers extension, you will see a wizard window where you can choose the provider you want to install and configure for your ASP.NET web application. Currently there are two sets of providers: Session State providers and an Output Cache provider.

Select a provider and click the Next button. You will see a list of providers that apply to your application and connect with Azure resources: currently the SQL SessionState provider, CosmosDB SessionState provider, RedisCache SessionState provider, and SQL OutputCache provider.

After the provider is chosen, the wizard will lead you to select the Azure instance that the provider will use. To fetch the Azure instances that apply to the selected provider, you will need to sign in with your account in Visual Studio. Then select an Azure instance and click the Finish button; the extension will install the relevant NuGet packages and update the web.config file to connect the provider with the selected Azure instance.

Things to be aware of

  1. If the application is already configured with a provider and you want to install the same type of provider, you need to remove the existing one first. For example, if your application is using the SQL SessionState provider and you want to switch to the CosmosDB SessionState provider, you need to remove the SessionState provider settings from web.config; then you can use ASP.NET Providers Connected Services to install and configure the CosmosDB SessionState provider.
  2. If you are installing the Async SQL SessionState provider or Async SQL OutputCache provider, you need to replace the user name and password in the connection string that ASP.NET Providers Connected Services adds to web.config, since you may have multiple accounts in your Azure SQL Database instance.

Summary

ASP.NET Providers Connected Services helps you install and configure ASP.NET providers for your web application to consume Azure services. Our goal with this Visual Studio extension is to provide a central place where you can configure different providers for your ASP.NET web applications and connect them with Azure. Please install the extension from the Visual Studio Marketplace today and let us know your feedback.
