
Announcing .NET Framework 4.8 Early Access build 3621!


Today, we are happy to share the .NET Framework 4.8 Early Access build 3621 for your feedback. This is one of the in-development builds of the next version of the .NET Framework. The changes in this build have been functionally validated by the .NET teams. We would love your help in ensuring this is a high-quality and compatible release. This build is not supported for production use.

Next steps:

This pre-release build 3621 includes improvements in several areas, including accessibility improvements in Windows Forms and WPF. You can see the complete list of improvements in this build in the release notes.

The .NET Framework build 3621 will replace any existing .NET Framework 4 and later installation on your machine. This means all .NET Framework 4 and later applications on your machine will run on the .NET Framework early access builds upon installation.

The .NET Framework build 3621 installs on the following Windows and Windows Server versions:

  • Windows 10 April 2018 Update
  • Windows 10 Fall Creators Update
  • Windows 10 Creators Update
  • Windows 10 Anniversary Update
  • Windows Server, version 1709
  • Windows Server 2016

We will continue sharing early builds of the next release of the .NET Framework via the Early Access Program on a regular basis for your feedback. As a member of the .NET Framework Early Access community, you will have an active role in helping us build new .NET Framework products. We will do our best to ensure these early access builds are stable and compatible, but you may see bugs or issues from time to time. We’d appreciate you taking the time to report these to us on GitHub so we can address them before the official release. Thank you.


Using STMicroelectronics starter kits to connect to Azure IoT in minutes


Microsoft partners with silicon vendors such as STMicroelectronics to simplify and accelerate the development of embedded systems, so our customers can move projects from proof of concept to production faster. One of the most common hurdles in IoT project development is exactly that transition from proof of concept to production: from a handful of devices to deployment and management at IoT scale, and from development hardware to mass-produced silicon.

STMicroelectronics offers a wide range of IoT hardware along with pre-integrated software, a powerful development ecosystem, and valuable starter kits. With these, connecting to Azure IoT Hub (the cloud platform to monitor and manage billions of IoT assets) using one of the Microsoft Azure Certified ST devices takes minutes, and you don’t have to write any code! This magic is possible because of the integration ST provides under the covers. For example, the STM32 IoT Discovery Kit Node is an Arm® Cortex®-M4-core-based developer kit sporting a full set of low-power wireless connectivity options and environmental, motion, and ranging sensors. FP-CLD-AZURE1 is an STM32Cube function pack that ST developed for this kit. The Azure IoT C SDK is integrated into the middleware of this function pack, which enables direct and secure connectivity to Azure IoT Hub, as well as full support for Azure device management. Furthermore, the platform-specific adapters that plug into the C SDK are available in source code, so developers can leverage this port across different MCU families using the hardware abstraction layer and STM32Cube.

To walk through the developer experience, we invited Manuel Cantone from STMicroelectronics to the Internet of Things Show on Channel 9. So let’s hear from Manuel himself on connecting the STM32 to the cloud via Azure!

8 reasons to choose Azure Stream Analytics for real-time data processing


Processing big data in real-time is now an operational necessity for many businesses. Azure Stream Analytics is Microsoft’s serverless real-time analytics offering for complex event processing. It enables customers to unlock valuable insights and gain competitive advantage by harnessing the power of big data. Here are eight reasons why you should choose Azure Stream Analytics for real-time analytics.

1. Fully integrated with the Azure ecosystem: Build powerful pipelines with a few clicks

Whether you have millions of IoT devices streaming data to Azure IoT Hub or apps sending critical telemetry events to Azure Event Hubs, it only takes a few clicks to connect multiple sources and sinks, creating an end-to-end pipeline. Azure Stream Analytics provides best-in-class integration with stores for your output, such as Azure SQL Database, Azure Cosmos DB, and Azure Data Lake Store. It also enables you to trigger custom workflows downstream with Azure Functions, Azure Service Bus Queues, and Azure Service Bus Topics, or to create real-time dashboards using Power BI.

2. Developer productivity

One of the biggest advantages of Stream Analytics is the simple SQL-based query language with its powerful temporal constraints for analyzing data in motion. Familiarity with SQL is sufficient to author powerful queries. Azure Stream Analytics supports language extensibility via JavaScript user-defined functions (UDFs) and user-defined aggregates to perform complex calculations as part of a Stream Analytics query. With the Stream Analytics Visual Studio tools you can author queries offline and use CI/CD to submit jobs to Azure. Native support for geospatial functions makes it easy to tackle complex scenarios like fleet management and mobile asset tracking.

3. Intelligent edge

Most data becomes useless just seconds after it’s generated. In many cases, processing data closer to the point of generation is becoming more and more critical: it lowers bandwidth costs and lets a system keep functioning even with intermittent connectivity. Azure Stream Analytics on IoT Edge enables you to deploy real-time analytics closer to IoT devices so that you can unlock the full value of device-generated data.

4. Easily leverage the power of machine learning

Azure Stream Analytics offers real-time event scoring by integrating with Azure Machine Learning solutions. Additionally, Stream Analytics offers built-in support for commonly used scenarios such as anomaly detection, which helps reduce the complexity associated with building and deploying an ML model in your hot-path analytics pipeline to a simple function call. Users can easily detect common anomalies such as spikes, dips, slow positive or negative trends with these online learning and scoring models.

5. Lower your cost of innovation

There are zero upfront costs, and you only pay for the number of streaming units you consume to process your data streams. There is absolutely no commitment or cluster provisioning, allowing you to focus on making the best use of this technology.

6. Best-in-class financially backed SLA by the minute

We understand it is critical for businesses to prevent data loss and ensure business continuity. Stream Analytics processes millions of events every second and can deliver results with low latency. This is why Stream Analytics guarantees event processing with a 99.9 percent availability SLA at the minute level, which is unparalleled in the industry.

7. Scale instantly

Stream Analytics is a fully managed serverless (PaaS) offering on Azure. There is no infrastructure to worry about, and no servers, virtual machines, or clusters to manage. We do all the heavy lifting for you in the background. You can instantly scale up or scale out the processing power from one to hundreds of streaming units for any job.

8. Reliable

Stream Analytics guarantees exactly-once event processing and at-least-once delivery of events. It has built-in recovery capabilities in case the delivery of an event fails, so you never have to worry about your events getting dropped.

Getting started is easy

There is a strong and growing developer community that supports Stream Analytics. We at Microsoft are committed to improving Stream Analytics and always listening for your feedback to improve our product! Learn how to get started and build a real-time fraud detection system.

Process more files than ever and use Parquet with Azure Data Lake Analytics


Azure Data Lake Analytics (ADLA) is a serverless PaaS service in Azure to prepare and transform large amounts of data stored in Azure Data Lake Store or Azure Blob Storage at unparalleled scale.

ADLA now offers some new, unparalleled capabilities for processing files of any format, including Parquet, at tremendous scale.

Previously: Handling tens of thousands of files is painful!

Many of our customers tell us that handling a large number of files is challenging, if not downright painful, in all the big data systems they have tried. Figure 1 shows the distribution of files in common data lake systems: most files are less than one GB, although a few may be huge.


Figure 1: The pain of many small files

ADLA was developed from a system originally designed to operate on very large files with internal structure that helps with scale-out, but it only operated on a couple of hundred to about 3,000 files. It also over-allocated resources when processing small files by giving one extract vertex to each file (a vertex is a compute container that executes a specific part of the script on a partition of the data, and takes time to create and destroy). Thus, like other analytics engines, it was not well-equipped to handle the common case of many small files.

Process hundreds of thousands of files in a single U-SQL job!

With the recent release, ADLA takes the capability to process large amounts of files of many different formats to the next level.

Figure 2: Improving the handling of many small files

ADLA increases the scale limit to schematize and process several hundred thousand files in a single U-SQL job using so-called file sets. In addition, we have improved the handling of many small files (see Figure 2) by grouping up to 200 files of at most 1 GB of data into a single vertex (in preview). Note that the extractor still handles one file at a time, but the vertex creation is now amortized across more data, so the system uses considerably fewer resources (Analytics Units), and costs less, during the extraction phase.

Figure 3 shows on the left how a job spent about 5 seconds creating each vertex, then processed the data in less than 1 second, and used many more vertices. On the right, with the improved file set handling, the job used far fewer resources (AUs) and made better use of the ones it allocated.



Figure 3: Job vertex usage when extracting from many small files, without file grouping (left) and with file grouping (right)

Customers who use these file set improvements have seen up to a 10-fold performance improvement in their jobs, resulting in much lower cost, and they can process many more files in a single job!

We still recommend maximizing the size of the files you process, though, because the system continues to do even better with large files.

Plus, ADLA now natively supports Parquet files

U-SQL offers built-in native extractors to schematize files and outputters to write data back into files, as well as the ability for users to add their own extractors. In this latest release, ADLA adds a public preview of a native extractor and outputter for the popular Parquet file format, and a private preview for ORC, making it easy to both consume and produce these popular data formats at large scale. This provides a more efficient way to exchange data with open source big data analytics solutions, such as HDInsight’s Hive LLAP and Azure Databricks, than the CSV formats previously required.

Head over to our Azure Data Lake Blog to see an end-to-end example of how we put this all together to cook a 3 TB file into 10,000 Parquet files and then process them both with the new file set scalability in U-SQL and query them with Azure Databricks’ Spark.

But wait, there’s more!

There are many additional new features, such as a preview of dynamically partitioned file generation (the top UserVoice ask!), a new AU modeler to optimize your jobs’ cost/performance trade-off, the ability to extract from files that use the Windows code pages, the ability to augment your script with job information through @JobInfo, and even lightweight, self-contained script development with in-script C# named lambdas and script-scoped U-SQL objects. See the spring release notes for all of the newly introduced capabilities.

Azure Backup for SQL Server on Azure now in public preview


Earlier this week, Corey Sanders announced the preview of a new Azure Backup capability to back up SQL workloads running in Azure Virtual Machines, in his post about why you should bet on Azure for your infrastructure needs today and in the future. In this blog, we will elaborate on how this enterprise backup capability provides a breakthrough that differentiates Azure from any other public cloud. The workload backup capability is built as an infrastructure-less, Pay-As-You-Go (PAYG) service that leverages native SQL backup and restore APIs to provide a comprehensive solution for backing up SQL Server instances running in Azure IaaS VMs.

Azure Backup for SQL

Key benefits

  • Zero-infrastructure backup: Freedom from managing backup infrastructure (e.g. backup server, agents or backup storage) or writing complex backup scripts.
  • Centrally manage and monitor all backups using Recovery Services Vault:
    • Create policies to specify the backup schedule and retention for both short-term and long-term retention needs using Grandfather-father-son-style retention schemes. Reuse these policies across multiple databases and servers.
    • Configure email notification for any backup or restore failure.
    • Monitor the backup jobs using Recovery Services Vault dashboard for all workloads including Azure IaaS VMs, Azure Files and SQL server databases.
  • Restore to any time, up to a specific second: Restore databases to any date and time, down to a specific second. Azure Backup provides a graphical overview of recovery point availability for the selected date, helping users choose the right recovery time. In the backend, the solution determines the appropriate chain of full, differential, and log backups corresponding to the selected time that need to be restored.
  • 15-minute Recovery Point Objective (RPO): Configure transaction log backups every 15 minutes to meet your organization’s backup SLA needs.
  • PAYG Service: No upfront payment is needed. Billing is based on consumption each month.
  • Native SQL API integration: Azure Backup uses native SQL APIs such that customers get the benefit of SQL backup compression, full fidelity backup and restore including full, differential and log backups. Customers can monitor their backup jobs using SSMS.
  • Support for Always On Availability Groups: Azure Backup protects databases in an Availability Group such that data protection continues seamlessly even after failover, while honoring the Availability Group backup preference.

Get started

The video below walks you through the steps to configure backup for your SQL Servers running in IaaS VMs. You can refer to the documentation for more details.

Upcoming planned enhancements

Below are some of the key features planned for general availability; we plan to stage them through the rest of the year. Please follow us on @AzureBackup and look out for a Twitter poll shortly to share your feedback on them.

  • Central customizable backup reports using Power BI.
  • Central customizable monitoring using OMS Log Analytics.
  • Automatic protection of newly added databases (auto-protect).
  • Support for PowerShell and Azure CLI.

Additional resources

Spring 2018 Visual C++ Migration Survey


The Spring 2018 Visual C++ Migration Survey is now open.

Please take a few minutes to share your experiences, positive or not so positive. If you have not migrated your solutions and projects to Visual Studio 2017, please let us know why. This survey is a way for us to better understand your migration issues and needs, and to provide the features you need.

Share your candid feedback now!

Thank you.

 

Case Study: How to use Distance Matrix API for Commute Analysis and Office Site Planning


The Bing Maps Distance Matrix API helps customers solve a broad range of business problems related to travel routes and times, for example: local data analysis, vehicle asset management, delivery routing optimization services, the travelling salesman problem, real estate site planning, sorting in job search sites, commute planning, and more.

Earlier this year, a Real Estate and Facilities (RE&F) team at a Microsoft subsidiary used the Bing Maps Distance Matrix API as a specialty tool to perform a quick commute analysis, to help determine the single best location for collocating all of their operations from multiple offices in a metro area (hereafter referred to as “Region X”) into one office location.

“As we were in process of collocating a new site, finding the right location for our colleagues was essential. We wanted to make it more accessible and to have a better commute for all our staff. Using the Bing Maps Distance Matrix API, we were able to quickly determine with HIGH ACCURACY where they live, what is the commute time currently, and what will be the commute time to the proposed locations. GREAT HELP in narrowing the options.  Therefore, GREAT TOOL with GREAT RESULTS,” said RE&F team Facilities Operations Manager.

In this blog post we share in detail how the RE&F team used the Bing Maps Distance Matrix API to gather the necessary data and insight to help them make the best decision possible for their organization.

Business problem

The Microsoft subsidiary is expanding into Region X with a plan to renovate its buildings and facilities. Its RE&F team investigated available options in the real estate market to collocate all operations from multiple locations within the target region into a single location, but had a hard time deciding on one.

Various factors were considered, one of the most critical being employee commute time. The RE&F team wanted to make sure they factored in how much time each employee spends commuting via public transportation or vehicle, and to identify a site that would be most easily accessible for all employees. A commute study was needed to investigate how employees could decrease their average commuting time. Yet given the large number of employees and candidate sites, analyzing the various combinations of commute scenarios was a challenge.

The study outlined below illustrates how the RE&F team met the challenge and analyzed commute times, taking into consideration the multiple current office locations situated in different areas of Region X versus the potential future office locations.

Analysis and solution using Distance Matrix API

The Bing Maps Distance Matrix API proved to be the perfect tool for the RE&F team’s commute study. They first needed to understand where within Region X their employees live. After running some HR reports, the RE&F team discovered that a large portion of the employees reside in cities outside of Region X. Additionally, each employee completed an anonymous survey collecting the street name and street number of their residence, along with the site where they work.

Using Bing Maps Distance Matrix API, the RE&F team was able to calculate the average time spent in traffic from each employee’s home address to the office. They then calculated the average difference in travel time during rush hours. With all of this data in hand, using Distance Matrix API, the team quickly calculated the average time spent in traffic by all the employees commuting to the current location compared with the average time that would be spent in traffic travelling to the new sites.
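As a rough illustration of the kind of call involved, the hedged C# sketch below scores one employee’s home against two candidate sites with a single Distance Matrix request. The coordinates are made up, YOUR_BING_MAPS_KEY is a placeholder, and a real study would loop over all employees and aggregate the returned travel durations.

using System;
using System.Net.Http;
using System.Threading.Tasks;

class CommuteStudySketch
{
    static async Task Main()
    {
        // One employee home (origin) and two candidate office sites (destinations).
        var origins = "47.6062,-122.3321";
        var destinations = "47.6205,-122.3493;47.6740,-122.1215";

        var url = "https://dev.virtualearth.net/REST/v1/Routes/DistanceMatrix" +
                  $"?origins={origins}&destinations={destinations}" +
                  "&travelMode=driving&key=YOUR_BING_MAPS_KEY";

        using (var http = new HttpClient())
        {
            // The JSON response contains a travel duration for each
            // origin/destination pair; aggregating those durations across
            // employees yields commute-time deltas like the tables below.
            string json = await http.GetStringAsync(url);
            Console.WriteLine(json);
        }
    }
}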

The results were then analyzed and mapped against the site locations, with the goal of finding the best location based on ease of access. An example table of the average commute time delta comparing all candidate sites is illustrated below:

Table 1. Average employee commute time to each site vs. average of all sites

Site #    Average commute time delta (minutes)
1         -18
2         -11
3         -8
X         +19



Figure 1. Illustration of employees’ commute analysis to multiple locations using Distance Matrix. (Note: Blue dots show the distribution of employee home locations)

The table above demonstrates a simplified approach to the problem: comparing average commute times. On that basis, Site #1 would be the best location because it has the shortest average commute time. However, the RE&F team went a step further and took the impact of employee distribution into account by analyzing and modeling each employee’s commute time. The Bing Maps Distance Matrix API powered the calculations based on road data, commute routes, and real-time traffic data.

The RE&F team then broke down the potential impact of each new site on commute time into three levels, with respect to each employee’s current office location:

1. New commute time that would be shorter due to potential relocation [This is a benefit]

2. New commute time that would increase but by less than 30 minutes [This has a mild negative impact]

3. New commute time that would increase by more than 30 minutes [This is a high-risk choice]

The table below summarizes the percentage of affected employees in each of the three impact levels.

Table 2. Key findings in driving mode using Distance Matrix API

Candidate site    % of employees with a shorter commute    % with an increase of less than 30 min    % with an increase of more than 30 min
1                 60%                                      40%                                       0%
2                 70%                                      30%                                       0%
3                 47%                                      53%
4                 51%                                      47%                                       2%
X

The Distance Matrix API supported analysis for both driving and public transportation.

Taking Table 2 as an example for the driving case, candidate Site #2 would be the overall winner in terms of positive impact: the highest percentage of employees (70%) would benefit from a shorter commute. That said, all candidate sites are fair choices, with mild increases in commute time of less than 30 minutes; only 0% to 2% of employees would see more than a 30-minute commute time increase.

Table 3. Key findings in public transport (i.e., transit) mode using Distance Matrix API

Candidate site    % of employees with an equal or shorter commute    % with an increase of less than 30 min    % with an increase of more than 30 min
1                 45%                                                38%                                       17%
2                 57%                                                35%                                       8%
3                 22%                                                68%                                       9%
4                 69%                                                19%                                       12%
X

Taking Table 3 as an example for the public transportation case, candidate Site #4 would be the overall winner in terms of positive impact: the highest percentage of employees (69%) would benefit from a shorter commute. Candidate Site #2 is also a fair choice, as it benefits 57% of employees while having the lowest percentage of employees who would see more than a 30-minute commute time increase. By contrast, candidate Site #1 would have the highest percentage (17%) of employees who run a high risk of a commute time increase of over 30 minutes. In summary, candidate Site #4 and candidate Site #2 would be fair options.

As illustrated in the analysis above, the Bing Maps Distance Matrix API ultimately helped the RE&F team quickly make the best decision in selecting a future office site, with a clearly positive impact on the overall results.

To get more information, data, and examples of the Distance Matrix API in action, see the documentation and visit www.microsoft.com/maps.

For any questions about other Bing Maps API use cases, contact the Bing Maps team at bmesupp@microsoft.com.

Simplifying IT with the latest updates from Windows Autopilot


With Windows Autopilot, our goal is to simplify deployment of new Windows 10 devices by eliminating the cost and complexity associated with creating, maintaining, and loading custom images. Windows Autopilot will revolutionize how new devices get deployed in your organization—now you can deliver new off-the-shelf Windows 10 devices directly to your users. With a few simple clicks, the device transforms itself into a fully business-ready state, dramatically reducing the time it takes for your users to get up and running with new devices.

Not only does Windows Autopilot significantly reduce the cost of deploying Windows 10 devices, but it also delivers an experience that’s magical for users and zero-touch for IT.

I’m excited to share that we are extending that zero-touch experience even further with several new capabilities available in preview with the Windows Insider Program today.

  • Self-Deploying mode—Currently, the Windows Autopilot experience requires the user to select basic settings like region, language, and keyboard, and to enter their credentials, in the Windows 10 out-of-the-box experience. With a new Windows Autopilot capability called “Self-Deploying mode,” we’re extending the zero-touch experience from IT to the user deploying the device. Power on* is all it takes to deploy a new Windows 10 device into a fully business-ready state—managed, secured, and ready for use—with no need for any user interaction. You can configure the device to self-deploy into a locked-down kiosk, digital signage, or a shared productivity device—all it takes is power on.*
  • Windows Autopilot reset—This feature extends the zero-touch experience from deployment of new Windows 10 devices to reset scenarios where a device is being repurposed for a new user. We’re making it possible to completely reset and redeploy an Intune-managed Windows 10 device into a fully business-ready state without having to physically access the device. All you need to do is click a button in Intune!

Windows Insiders can test these features with the latest Windows 10 build and Microsoft Intune now.

I cannot wait to see the feedback from the Insider community! To see how this works, and several exciting updates to Windows Autopilot, check out this quick video:

 Source video.

You can head over to the Windows IT Pro blog right now for further details.

One final note: A big part of what we build is based on feedback from our customers. With this in mind, we also added several new Windows Autopilot capabilities into the Windows 10 April 2018 Update (version 1803) based on feedback, and these capabilities are also available today:

  • Enrollment Status page—We received tons of feedback from Windows Autopilot customers who want the ability to hold the device in the out-of-box setup experience until the configured policies and apps have been provisioned to the device. This enables IT admins to be assured the device is configured into a fully business-ready state prior to users getting to the desktop. This is made possible with a capability called “Enrollment Status” and is available today with Windows 10 April 2018 Update (version 1803) and Microsoft Intune.
  • Device vendor supply chain integration—We enabled Windows 10 OEMs and hardware vendors to integrate Windows Autopilot into their supply chain and fulfillment systems so that devices are registered in Windows Autopilot to your organization the moment your purchase is fulfilled. This makes the registration of Windows Autopilot devices completely hands-free and zero-touch for you as well as your device vendor/OEM. Contact your device reseller to find out if they are supporting Windows Autopilot.
  • Automatic Windows Autopilot profile assignment—We integrated Azure Active Directory (AD) dynamic groups with Windows Autopilot and Microsoft Intune to deliver a zero-touch experience for Windows Autopilot profile assignments on all Windows Autopilot devices.

I said this in my prior post and I’ll say it again—Windows Autopilot is an absolute game changer. I urge you to spend some time learning more about it.

To learn more about how to use Windows Autopilot and Co-Management together, check out this quick video.

*Requires network connection and TPM2.0.

The post Simplifying IT with the latest updates from Windows Autopilot appeared first on Microsoft 365 Blog.


LinkedIn Announces New Commute Feature Powered by Bing Maps


Today, LinkedIn introduced a new, helpful feature for their many users: “Your Commute.”

LinkedIn has been working with the Bing Maps Platform Team to add this valuable new feature to their mobile experience, allowing users to see what the commute time is for a selected job posting. Built with the help of several Bing Maps APIs, the “Your Commute” feature helps users calculate their potential commute time to the new job they are considering. The Bing Maps services used in the feature include Autosuggest, Geocoding, Routing and Maps.

Because Bing Maps supports multiple travel modes, the commute time from your location to the potential job can be calculated whether you are travelling by car, using public transportation, or even walking. The feature also utilizes predictive traffic data to take rush-hour traffic into account, giving job seekers another data point to consider when selecting their next job.

To learn more about the “Your Commute” feature, read the announcement on the LinkedIn blog and an overview of the feature on TechCrunch.

For more information about the Bing Maps Platform APIs that help power this LinkedIn feature, go to https://www.microsoft.com/maps.

- Bing Maps Team

Blazor 0.4.0 experimental release now available


Blazor 0.4.0 is now available! This release includes important bug fixes and several new feature enhancements.

New features in Blazor 0.4.0 (details below):

  • Add event payloads for common event types
  • Use camelCase for JSON handling
  • Automatic import of core Blazor namespaces in Razor
  • Send and receive binary HTTP content using HttpClient
  • Templates run on IIS Express by default with autobuild enabled
  • Bind to numeric types
  • JavaScript interop improvements

A full list of the changes in this release can be found in the Blazor 0.4.0 release notes.

Get Blazor 0.4.0

To get set up with Blazor 0.4.0:

  1. Install the .NET Core 2.1 SDK (2.1.300 or later).
  2. Install Visual Studio 2017 (15.7) with the ASP.NET and web development workload selected.
    • Note: The Blazor tooling isn’t currently compatible with the VS2017 preview channel (15.8). This will be addressed in a future Blazor release.
  3. Install the latest Blazor Language Services extension from the Visual Studio Marketplace.

To install the Blazor templates on the command-line:

dotnet new -i Microsoft.AspNetCore.Blazor.Templates

You can find getting started instructions, docs, and tutorials for Blazor at https://blazor.net.

Upgrade an existing project to Blazor 0.4.0

To upgrade an existing Blazor project from 0.3.0 to 0.4.0:

  • Install all of the required bits listed above.
  • Update your Blazor package and .NET CLI tool references to 0.4.0.

Your upgraded Blazor project file should look like this:

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <RunCommand>dotnet</RunCommand>
    <RunArguments>blazor serve</RunArguments>
    <LangVersion>7.3</LangVersion>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.Blazor.Browser" Version="0.4.0" />
    <PackageReference Include="Microsoft.AspNetCore.Blazor.Build" Version="0.4.0" />
    <DotNetCliToolReference Include="Microsoft.AspNetCore.Blazor.Cli" Version="0.4.0" />
  </ItemGroup>

</Project>

Event payloads for common event types

This release adds payloads for the following event types:

Event arguments      Events
UIMouseEventArgs     onmouseover, onmouseout, onmousemove, onmousedown, onmouseup, oncontextmenu
UIDragEventArgs      ondrag, ondragend, ondragenter, ondragleave, ondragover, ondragstart, ondrop
UIPointerEventArgs   gotpointercapture, lostpointercapture, pointercancel, pointerdown, pointerenter, pointerleave, pointermove, pointerout, pointerover, pointerup
UITouchEventArgs     ontouchcancel, ontouchend, ontouchmove, ontouchstart, ontouchenter, ontouchleave
UIWheelEventArgs     onwheel, onmousewheel
UIKeyboardEventArgs  onkeydown, onkeyup, onkeypress
UIProgressEventArgs  onloadstart, ontimeout, onabort, onload, onloadend, onprogress, onerror

Thank you to Gutemberg Ribeiro (galvesribeiro) for this contribution! If you haven't checked out Gutemberg's handy collection of Blazor extensions, they are definitely worth a look.

Use camelCase for JSON handling

The Blazor JSON helpers and utilities now use camelCase by default. .NET objects serialized to JSON are serialized using camelCase for the member names. On deserialization a case-insensitive match is used. The casing of dictionary keys is preserved.
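As a minimal sketch of the behavior, assuming the Blazor JSON helpers described above (the exact helper names and the Person type here are illustrative):

using Microsoft.AspNetCore.Blazor;

public class Person
{
    // PascalCase members on the .NET side...
    public string FirstName { get; set; }
    public int Age { get; set; }
}

public static class JsonCasingExample
{
    public static void Demo()
    {
        // ...serialize to camelCase member names, roughly:
        // {"firstName":"Ada","age":36}
        // (Json.Serialize/Json.Deserialize are assumed from the
        // release notes' "JSON helpers".)
        var json = Json.Serialize(new Person { FirstName = "Ada", Age = 36 });

        // Deserialization matches case-insensitively, so "firstName"
        // and "FirstName" both bind back to Person.FirstName.
        var person = Json.Deserialize<Person>(json);
    }
}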

Automatic import of core Blazor namespaces in Razor

Blazor now automatically imports the Microsoft.AspNetCore.Blazor and Microsoft.AspNetCore.Blazor.Components namespaces in Razor files, so you don't need to add @using statements for them. One less thing for you to do!

Send and receive binary HTTP content using HttpClient

You can now use HttpClient to send and receive binary data from a Blazor app (previously you could only handle text content). Thank you Robin Sue (Suchiman) for this contribution!
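For instance, a hedged sketch of round-tripping raw bytes (the relative URLs are placeholders for endpoints your app would define):

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public class BinaryTransferExample
{
    private readonly HttpClient _http;

    public BinaryTransferExample(HttpClient http) => _http = http;

    // Receive binary content -- previously only text content was usable.
    public Task<byte[]> DownloadLogoAsync()
        => _http.GetByteArrayAsync("images/logo.png");

    // Send binary content as the request body.
    public async Task UploadAsync(byte[] payload)
    {
        var content = new ByteArrayContent(payload);
        content.Headers.ContentType =
            new MediaTypeHeaderValue("application/octet-stream");
        await _http.PostAsync("api/upload", content);
    }
}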

Bind to numeric types

Binding now works with numeric types: long, float, double, decimal. Thanks again to Robin Sue (Suchiman) for this contribution!

Templates run on IIS Express by default with autobuild enabled

The Blazor project templates are now set up to run on IIS Express by default, while still preserving autobuild support.

JavaScript interop improvements

Call async JavaScript functions from .NET

With Blazor 0.4.0 you can now call and await registered JavaScript async functions like you would an async .NET method using the new RegisteredFunction.InvokeAsync method. For example, you can register an async JavaScript function so it can be invoked from your Blazor app like this:

Blazor.registerFunction('BlazorLib1.DelayedText', function (text) {
    // Wait 1 sec and then return the specified text
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            resolve(text);
        }, 1000);
    });
});

You then invoke this async JavaScript function using InvokeAsync like this:

public static class ExampleJSInterop
{
    public static Task<string> DelayedText(string text)
    {
        return RegisteredFunction.InvokeAsync<string>("BlazorLib1.DelayedText", text);
    }
}

Now you can await the async JavaScript function like you would any normal C# async method:

var text = await ExampleJSInterop.DelayedText("See ya in 1 sec!");

Call .NET methods from JavaScript

Blazor 0.4.0 makes it easy to call sync and async .NET methods from JavaScript. For example, you might call back into .NET when a JavaScript callback is triggered. While calling into .NET from JavaScript was possible with earlier Blazor releases, the pattern was low-level and difficult to use. Blazor 0.4.0 provides a simpler pattern with the new Blazor.invokeDotNetMethod and Blazor.invokeDotNetMethodAsync functions.

To invoke a .NET method from JavaScript the target .NET method must meet the following criteria:

  • Static
  • Non-generic
  • No overloads
  • Concrete JSON serializable parameter types

For example, let's say you wanted to invoke the following .NET method when a timeout is triggered:

namespace Alerts
{
    public class Timeout
    {
        public static void TimeoutCallback()
        {
            Console.WriteLine("Timeout triggered!");
        }
    }
}

You can call this .NET method from JavaScript using Blazor.invokeDotNetMethod like this:

Blazor.invokeDotNetMethod({
    type: {
        assembly: 'MyTimeoutAssembly',
        name: 'Alerts.Timeout'
    },
    method: {
        name: 'TimeoutCallback'
    }
})

When invoking an async .NET method from JavaScript, if the .NET method returns a Task, the JavaScript invokeDotNetMethodAsync function returns a Promise that completes with the Task's result (so JavaScript/TypeScript can also await it).
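For example, a hedged sketch of an async .NET method that meets the criteria listed above; the Alerts.Lookup type is made up for illustration, and calling it via Blazor.invokeDotNetMethodAsync would surface the Task<string> result as a Promise on the JavaScript side:

using System.Threading.Tasks;

namespace Alerts
{
    public class Lookup
    {
        // Static, non-generic, no overloads, and both the int parameter
        // and the string result are JSON serializable.
        public static async Task<string> FetchMessageAsync(int id)
        {
            await Task.Delay(500); // stand-in for real async work
            return $"Message #{id}";
        }
    }
}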

Summary

We hope you enjoy this latest preview of Blazor. Your feedback is especially important to us during this experimental phase for Blazor. If you run into issues or have questions while trying out Blazor, please file issues on GitHub. You can also chat with us and the Blazor community on Gitter if you get stuck, or to share how Blazor is working for you. After you've tried out Blazor for a while, please also let us know what you think by taking our in-product survey. Just click the survey link shown on the app home page when running one of the Blazor project templates:

Blazor survey

Thanks for trying out Blazor!

In case you missed it: May 2018 roundup


In case you missed them, here are some articles from May of particular interest to R users.

The R Consortium has announced a new round of grants for projects proposed by the R community.

A look back at the rOpenSci unconference held in Seattle.

Video of my European R Users Meeting talk, "Speeding up R with Parallel Programming in the Cloud".

Slides from my talk at the Microsoft Build conference, "Open-Source Machine Learning in Azure".

Discussions on Twitter: R packages by stage of data analysis; thinking differently about AI development; and, why is package management harder in Python than R? 

Our May 2018 roundup of AI and data science news.

Panelist Francesca Lazzeri reviews the Mind Bytes AI conference in Chicago.

And some general interest stories (not necessarily related to R):

As always, thanks for the comments and please send any suggestions to me at davidsmi@microsoft.com. Don't forget you can follow the blog using an RSS reader, via email using blogtrottr, or by following me on Twitter (I'm @revodavid). You can find roundups of previous months here.

Preview of Visual Studio Kubernetes Tools


Kubernetes is an open source system that is quickly emerging as the preferred container orchestration system for applications of all shapes and sizes, simplifying the deployment, scaling, and operations of application containers. It runs in a variety of environments, including on premises as well as in cloud providers, such as Microsoft’s own Azure Kubernetes Service (AKS).

Here on the Visual Studio team, we are working on ways to better support developers who are building containerized applications that target Kubernetes. In talking to these developers, we’ve heard that it can be challenging to create Dockerfiles, Helm charts, and other configuration-as-code files required to create container images and deploy them to Kubernetes. And taking your code from Visual Studio to your Kubernetes cluster requires memorizing some pretty complicated CLI commands.

We now have a preview available of the first version of the Visual Studio Kubernetes Tools, which aims to simplify the Kubernetes experience for Visual Studio developers. Please note there are some prerequisites to using these tools; check out the detailed tutorial for complete instructions and guidance.

With the tools installed, you can create a new “Container Application for Kubernetes” project, or add Kubernetes support to an existing .NET Core web application. When you do this, Visual Studio will automatically create a Dockerfile and a Helm chart for your project. You can easily create a container image to run your application, or use these files to deploy to any Kubernetes cluster. These tools will also integrate with Azure Dev Spaces, which provides a rapid, iterative development experience right in Azure Kubernetes Service.

Solution Explorer

Users of Azure Kubernetes Service can also deploy to an AKS cluster directly from Visual Studio, via the new “Publish to Azure AKS” option.

Publish to Azure AKS option in Solution Explorer

If you’re interested in trying out the Visual Studio Kubernetes Tools, please provide your email address on this form. We’ll follow up with a more detailed tutorial and a link to download the tools. Have questions or feedback? You can reach us at vsk8stools@microsoft.com.

Lisa Guthrie, Program Manager, Azure Developer Experience

Lisa has been at Microsoft since 2001, working as a developer support engineer, a Premier Field Engineer, and most recently as a PM. She currently works on container-related tooling for Visual Studio, including Docker, Kubernetes, and Azure Dev Spaces.

Microsoft R Open 3.5.0 now available


Microsoft R Open 3.5.0 is now available for download for Windows, Mac and Linux. This update includes the open-source R 3.5.0 engine, which is a major update with many new capabilities and improvements to R. In particular, it includes a major new framework for handling data in R, with some major behind-the-scenes performance and memory-use benefits (and with further improvements expected in the future).

Microsoft R Open 3.5.0 points to a fixed CRAN snapshot taken on June 1, 2018. This provides a reproducible experience when installing CRAN packages by default, but you can always change the default CRAN repository or use the built-in checkpoint package to access snapshots of packages from an earlier or later date.

Relatedly, many new packages have been released since the last release of Microsoft R Open, and you can browse a curated list of some interesting ones on the Microsoft R Open Package Spotlight page.

We hope you find Microsoft R Open useful, and if you have any comments or questions please visit the Microsoft R Open forum. You can follow the development of Microsoft R Open at the MRO Github repository. To download Microsoft R Open, simply follow the link below.

MRAN: Download Microsoft R Open

Because it’s Friday: Sealese


If you've ever wondered what the animals are saying in those cute animal videos, fear not: YouTube has now apparently added captions for animal vocalizations, which makes everything clear:

That's all for this week! Have a good weekend, and we'll be back next week.

Top stories from the VSTS community – 2018.06.08

In case you’ve just gotten back from a backpacking adventure with no internet, no telephones, and no homing pigeons, you may have missed the big news this week: Microsoft has agreed to acquire GitHub! It’s hard to imagine anything bigger, so this week we have a roundup of the top...

ASP.NET Core Architect David Fowler’s hidden gems in 2.1


Open-source ASP.NET Core 2.1 is out, and Architect David Fowler took to Twitter to share some hidden gems that not everyone knows about. Sure, it's faster, builds faster, and runs faster, but there are a number of details and fun advanced techniques that are worth a closer look.

.NET Generic Host

ASP.NET Core introduced a new hosting model. .NET apps configure and launch a host.

The host is responsible for app startup and lifetime management. The goal of the Generic Host is to decouple the HTTP pipeline from the Web Host API to enable a wider array of host scenarios. Messaging, background tasks, and other non-HTTP workloads based on the Generic Host benefit from cross-cutting capabilities, such as configuration, dependency injection (DI), and logging.

This means that there's not just a WebHost anymore, there's a Generic Host for non-web-hosting scenarios. You get the same feeling as with ASP.NET Core and all the cool features like DI, logging, and config. The sample code for a Generic Host is up on GitHub.
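A minimal sketch of what that looks like, assuming the Microsoft.Extensions.Hosting and Microsoft.Extensions.Logging.Console 2.1 packages (the Worker class is a made-up example):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

class Program
{
    static Task Main(string[] args) =>
        new HostBuilder()                       // Generic Host: no HTTP pipeline
            .ConfigureLogging(logging => logging.AddConsole())
            .ConfigureServices(services =>
                services.AddHostedService<Worker>())
            .Build()
            .RunAsync();                        // runs until Ctrl+C / SIGTERM
}

class Worker : IHostedService
{
    public Task StartAsync(CancellationToken cancellationToken)
    {
        Console.WriteLine("Worker starting.");
        return Task.CompletedTask;
    }

    public Task StopAsync(CancellationToken cancellationToken)
    {
        Console.WriteLine("Worker stopping.");
        return Task.CompletedTask;
    }
}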

IHostedService

A way to run long-running background operations in both the generic host and in your web-hosted applications. ASP.NET Core 2.1 added support for a BackgroundService base class that makes it trivial to write a long-running async loop. The sample code for a Hosted Service is also up on GitHub.

Check out a simple Timed Background Task:

public Task StartAsync(CancellationToken cancellationToken)
{
    _logger.LogInformation("Timed Background Service is starting.");

    _timer = new Timer(DoWork, null, TimeSpan.Zero,
        TimeSpan.FromSeconds(5));

    return Task.CompletedTask;
}

Fun!
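And the BackgroundService base class mentioned above collapses the same idea into a single overridden async loop. A hedged sketch (TimedWorker is illustrative):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class TimedWorker : BackgroundService
{
    // One overridable method instead of StartAsync/StopAsync plumbing;
    // the host signals shutdown through stoppingToken.
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            Console.WriteLine("Doing work...");
            await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken);
        }
    }
}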

Windows Services on .NET Core

You can now host ASP.NET Core inside a Windows Service! Lots of people have been asking for this. Again, no need for IIS, and you can host whatever makes you happy. Check out Microsoft.AspNetCore.Hosting.WindowsServices on NuGet and extensive docs on how to host your own ASP.NET Core app without IIS on Windows as a Windows Service.

public static void Main(string[] args)
{
    var pathToExe = Process.GetCurrentProcess().MainModule.FileName;
    var pathToContentRoot = Path.GetDirectoryName(pathToExe);

    var host = WebHost.CreateDefaultBuilder(args)
        .UseContentRoot(pathToContentRoot)
        .UseStartup<Startup>()
        .Build();

    host.RunAsService();
}

IHostingStartup - Configure IWebHostBuilder with an Assembly Attribute

Simple and clean with source on GitHub as always.

[assembly: HostingStartup(typeof(SampleStartups.StartupInjection))]

Shared Source Packages

This is an interesting one you should definitely take a moment to pay attention to. It's possible to build packages that are used as helpers to share source code; we internally call these "shared source packages." These are used all over ASP.NET Core for things that should be shared BUT shouldn't be public APIs. They get used at build time but won't end up as actual dependencies of your resulting package.

They are consumed like this in a CSPROJ. Notice the PrivateAssets attribute.

<PackageReference Include="Microsoft.Extensions.ClosedGenericMatcher.Sources" PrivateAssets="All" Version="" />

<PackageReference Include="Microsoft.Extensions.ObjectMethodExecutor.Sources" PrivateAssets="All" Version="" />

ObjectMethodExecutor

If you ever need to invoke a method on a type via reflection and that method could be async, we have a helper that we use everywhere in the ASP.NET Core code base that is highly optimized and flexible called the ObjectMethodExecutor.

The team uses this code in MVC to invoke your controller methods, and in SignalR to invoke your hub methods. It handles async and sync methods, as well as custom awaitables and F# async workflows.

SuppressStatusMessages

A small and commonly requested one. If you hate the output that dotnet run gives when you host a web application (printing out the binding information) you can use the new SuppressStatusMessages extension method.

WebHost.CreateDefaultBuilder(args)
    .SuppressStatusMessages(true)
    .UseStartup<Startup>();

AddOptions

They made it easier in 2.1 to configure options that require services. Previously, you would have had to create a type that derived from IConfigureOptions<TOptions>; now you can do it all in ConfigureServices via AddOptions<TOptions>:

public void ConfigureServices(IServiceCollection services)
{
    services.AddOptions<MyOptions>()
        .Configure<IHostingEnvironment>((o, env) =>
        {
            o.Path = env.WebRootPath;
        });
}

HttpContext via AddHttpContextAccessor

You likely shouldn't be digging around for HttpContext outside of the request pipeline, but lots of folks ask how to get to it and some feel it should be automatic. It's not registered by default since having it has a performance cost. However, in ASP.NET Core 2.1 a PR added an extension method that makes it easy IF you want it.

services.AddHttpContextAccessor();
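With that registered, any service can take a dependency on IHttpContextAccessor; a hedged sketch (MyService is illustrative):

using Microsoft.AspNetCore.Http;

public class MyService
{
    private readonly IHttpContextAccessor _accessor;

    // Injectable once services.AddHttpContextAccessor() has been called.
    public MyService(IHttpContextAccessor accessor) => _accessor = accessor;

    public string CurrentRequestPath()
        // HttpContext is null when no request is in flight.
        => _accessor.HttpContext?.Request.Path.Value ?? "(no active request)";
}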

So ASP.NET Core 2.1 is out and ready to go

New features in this release include:

Check out What's New in ASP.NET Core 2.1 in the ASP.NET Core docs to learn more about these features. For a complete list of all the changes in this release, see the release notes.

Go give it a try. Follow this QuickStart and you can have a basic Web App up in 10 minutes.



Microsoft Azure Data welcomes attendees to ACM SIGMOD/PODS 2018


Hello SIGMOD attendees!

Welcome to Houston, and to what is shaping up to be a great conference.  I wanted to take this opportunity to share with you some of the exciting work in data that’s going on in the Azure Data team at Microsoft, and to invite you to take a closer look.

Microsoft has long been a leader in database management with SQL Server, recognized as the top DBMS by Gartner for the past three years in a row.  The emergence of the cloud and edge as the new frontiers for computing, and thus data management, is an exciting direction—data is now dispersed within and beyond the enterprise, on-prem, on-cloud, and on edge devices, and we must enable intelligent analysis, transactions, and responsible governance for all data everywhere, from the moment it is created to the moment it is deleted, through the entire life-cycle of ingestion, updates, exploration, data prep, analysis, serving, and archival. 

These trends require us to fundamentally re-think data management.  Transactional replication can span continents.  Data is not just relational.  Interactive, real-time, and streaming applications with enterprise level SLAs are becoming common.  Machine learning is a foundational analytic task and must be supported while ensuring that all data governance policies are enforced, taking full advantage of advances in deep learning and hardware acceleration via FPGAs and GPUs. Microsoft is a very data-driven company, and to support product teams such as Bing, Office, Skype, Windows, and Xbox, the Azure Data team operates some of the world’s largest data services for our internal customers.  This grounds our thinking about how data management is changing, through close collaborations with some of the most sophisticated application developers on the planet. Of course, even in a company like Microsoft, the majority of data owners and users are domain experts, not database experts. This means that even though the underlying complexity of data management is growing rapidly, we need to greatly simplify how users deal with their data.  Indeed, a large cloud database service contains millions of customer databases, and requires us to handle many of the tasks previously handled by database administrators.

We are excited about the opportunity to re-imagine data management. We are making broad and deep investments in SQL Server and open source technologies, in Azure data services that leverage them, as well as in new built-for-cloud technologies. We have an ambitious vision of the future of data management that embraces data lakes and NoSQL, on-prem, in the cloud, and on edge devices. There has never been a better time to be part of database systems innovation at Microsoft, and we invite you to explore the opportunities to be part of our team.

This is an exciting time in our industry with many companies competing for talent and customers, and as you consider options, we want to highlight how Microsoft is differentiated in several respects. First, we have a unique combination of a world-class data management ecosystem in SQL Server and a leading public cloud in Azure.  Second, our culture puts customers first with a commitment to bringing them the best of open source technologies alongside the best of Microsoft.  Third, we have a deep commitment to innovation; product teams collaborate closely with research and advanced development groups at Cloud Information Services Lab, Gray Systems Lab, and Microsoft Research to go farther faster, and to maintain strong ties with the research community.

In this blog, I’ve listed the many services and ongoing work in the Azure Data group at Microsoft, together with links that will give you a closer look.  I hope you will find these of interest.

Have a great SIGMOD conference!

Rohan Kumar

CVP, Azure Data


Azure Data blogs:

The Microsoft data platform continues momentum for cloud-scale innovation and migration

Azure data blog posts by Rohan Kumar

 

SQL Server

Industry leading database management system, now available on Linux/Docker and Windows, on-prem and in the cloud.

The SQL Server blog

SQL Server 2017

 

Azure Cosmos DB: The industry’s first globally-distributed, multi-model database service

Azure Cosmos DB is the first globally-distributed data service that lets you elastically scale throughput and storage across any number of geographical regions while guaranteeing low latency, high availability and consistency – backed by the most comprehensive SLAs in the industry. Azure Cosmos DB is built to power today’s IoT and mobile apps, and tomorrow’s AI-hungry future.
It is the first cloud database to natively support a multitude of data models and popular query APIs, is built on a novel database engine capable of ingesting sustained volumes of data and provides blazing-fast queries – all without having to deal with schema or index management. And it is the first cloud database to offer five well-defined consistency models, so you can choose just the right one for your app.

Azure Cosmos DB: The industry's first globally-distributed, multi-model database service

 

Azure Data Lake Analytics - An on-demand analytics job service to power intelligent action

Easily develop and run massively parallel data transformation and processing programs in U-SQL, R, Python, and .NET over petabytes of data. With no infrastructure to manage, you can process data on demand, scale instantly, and only pay per job. There is no infrastructure to worry about because there are no servers, virtual machines, or clusters to wait for, manage, or tune. Instantly scale the processing power for any job.

Azure Data Lake Analytics

 

Azure Data Catalog – Find, understand and govern data

Companies collectively own exabytes of data across hundreds of billions of streams, files, tables, reports, spreadsheets, VMs, etc. But for the most part, companies don't know where their data is, what's in it, how it's being used, or whether it's secure. Our job is to fix that. We are building an automated infrastructure to find and classify any form of data anywhere it lives, from on-premises to cloud, figure out what the data contains, and determine how it should be managed and protected. Our global-scale infrastructure will track all of this data and provide a search engine to make the data discoverable and easily re-usable. From AI to distributed systems, we are using cutting-edge technologies to make data useful.

 

Azure Data Factory – Managed, hybrid data integration service at scale

Create, schedule, and manage your data integration at scale. Work with data wherever it lives, in the cloud or on-premises, with enterprise-grade security. Accelerate your data integration projects by taking advantage of over 70 available data source connectors. Use the graphical user interface to build and manage your data pipelines. Transform raw data into finished, shaped data ready for consumption by business intelligence tools or custom applications. Easily lift your SQL Server Integration Services (SSIS) packages to Azure, and let Azure Data Factory manage your resources so you can increase productivity and lower total cost of ownership (TCO).

Azure Data Factory

 

Azure HDInsight – Managed cluster service for the full spectrum of Hadoop & Spark

Azure HDInsight is a fully-managed cluster service that makes it easy, fast, and cost-effective to process massive amounts of data. Use popular open-source frameworks such as Hadoop, Spark, Hive, LLAP, Kafka, Storm, ML Services & more. Azure HDInsight enables a broad range of scenarios such as ETL, Data Warehousing, Machine Learning, IoT and more. A cost-effective service that is powerful and reliable. Pay only for what you use. Create clusters on demand, then scale them up or down. Decoupled compute and storage provide better performance and flexibility.

Azure HDInsight

 

Azure SQL DB - The intelligent relational cloud database service

The world’s first intelligent database service that learns and adapts with your application, enabling you to dynamically maximize performance with very little effort on your part. Working around the clock to learn, profile and detect anomalous database activities, Threat Detection identifies potential threats to the database, and like an airplane’s flight data recorder, Query Store collects detailed historical information about all queries, greatly simplifying performance forensics by reducing the time to diagnose and resolve issues.

Azure SQL Database

Adaptive query processing support for Azure SQL Database

Improved Automatic Tuning boosts your Azure SQL Database performance

Query Store: A flight data recorder for your database

 

Azure SQL DW - Fast, flexible, and secure analytics platform

A fast, fully managed, elastic, petabyte-scale cloud data warehouse. Azure SQL Data Warehouse lets you independently scale compute and storage, while pausing and resuming your data warehouse within minutes through a distributed processing architecture designed for the cloud. Seamlessly create your hub for analytics along with native connectivity with data integration and visualization services, all while using your existing SQL and BI skills. Through PolyBase, Azure SQL Data Warehouse brings data lake data into the warehouse with support for rich queries over files and unstructured data.
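To make the PolyBase point concrete, here is a hedged T-SQL sketch (every name and location below is hypothetical, and a database-scoped credential may also be required for secured storage) that exposes files in a data lake as an external table and then queries it like any other table:

CREATE EXTERNAL DATA SOURCE MyLake
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://data@mystorageaccount.blob.core.windows.net'
);

CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

CREATE EXTERNAL TABLE dbo.ClickLogs
(
    UserId INT,
    Url NVARCHAR(400)
)
WITH (
    LOCATION = '/clicklogs/',
    DATA_SOURCE = MyLake,
    FILE_FORMAT = CsvFormat
);

SELECT TOP 10 * FROM dbo.ClickLogs;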

Azure SQL Data Warehouse

Blazing fast data warehousing with Azure SQL Data Warehouse

Azure SQL Database Threat Detection, your built-in security expert

Azure SQL Data Warehouse now generally available in all Azure regions worldwide

 

Azure Stream Analytics - A serverless real-time analytics service to power intelligent action

Easily develop and run massively parallel real-time analytics on multiple IoT or non-IoT streams of data using a simple SQL-like language. Use custom code for advanced scenarios. With no infrastructure to manage, you can process data on demand, scale instantly, and only pay per job. Azure Stream Analytics seamlessly integrates with Azure IoT Hub and Azure IoT Suite. Azure Stream Analytics is also available on Azure IoT Edge, enabling near-real-time analytical intelligence closer to IoT devices so that they can unlock the full value of device-generated data.
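To show what that "simple SQL-like language" looks like, here is a minimal sketch of a complete job query (the input and output names are hypothetical aliases defined on the job): it computes a per-device average temperature over 60-second tumbling windows.

SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp() AS windowEnd  -- end of the current window
INTO
    [output-alias]
FROM
    [input-iothub] TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(second, 60)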

Azure Stream Analytics

 

Gray Systems Lab (GSL)  and Cloud and Information Services Lab (CISL)

The Azure Data Office of the CTO is the home of the Gray Systems Lab (GSL) and the Cloud and Information Services Lab (CISL). These are applied research groups, employing scientists and engineers focusing on everything data: from SQL Server in the cloud to massive-scale distributed systems for Big Data, from hardware-accelerated data processing to self-healing streaming solutions. The labs were founded in 2008 and 2012, respectively, have since produced 70+ papers in top database and systems conferences, have filed numerous patents, and are continuously engaged in deep academic collaborations through internships and sponsorship of academic projects. This research focus is matched with an equally strong commitment to open source (several committers in Apache projects who together have contributed over 500K LOC) and continuous, deep engagement with product teams. Strong product partnerships provide an invaluable path for innovation to affect the real world: the labs’ innovations today govern hundreds of thousands of servers in our Big Data infrastructure, and ship with several cloud and on-premises products. Microsoft’s cloud-first strategy gives researchers in our group a unique vantage point, where PBs of telemetry data and first-party workloads make every step of our innovation process principled and data-driven. The labs have locations on Microsoft’s Redmond and Silicon Valley campuses, as well as on the UW-Madison campus. Contact Carlo Curino (carlo.curino@microsoft.com) or Alan Halverson (alanhal@microsoft.com) to hear more.

Azure.Source – Volume 35


Featuring Infrastructure-as-a-Service on Azure

Why you should bet on Azure for your infrastructure needs, today and in the future - We are committed to providing the right infrastructure for every workload. Across our 50 Azure regions, you can pick from a broad array of virtual machines with varying CPU, GPU, Memory and disk configurations for your application needs. In this post on the Azure blog, Corey Sanders, Corporate Vice President, Azure, shares a few key reasons you should bet on Azure for your infrastructure needs and announces some new capabilities. In addition, these blog posts expand on items covered in Corey's post:

  • (Preview) Standard SSD Disks for Azure Virtual machine workloads - Azure Standard SSD Managed Disks, a new type of durable storage for Microsoft Azure virtual machines, are now available in preview. Azure Standard SSDs provide consistent performance for low IOPS workloads and deliver better availability and latency compared to HDD Disks. Azure Standard SSDs are also a cost-effective Managed Disk solution optimized for dev-test and entry-level production applications requiring consistent latency.
  • Microsoft Azure Stack expands availability to 92 countries - We are expanding the geographical coverage of Azure Stack to meet the growing demand of the customers globally. Azure Stack will now be available in 92 countries throughout the world.
  • General availability: Disaster recovery for Azure IaaS virtual machines - Disaster recovery for Azure IaaS virtual machines using Azure Site Recovery is now generally available. The cross-region DR feature is generally available in all Azure public regions where Site Recovery is available.
  • Azure Backup for SQL Server on Azure now in public preview - This is a breakthrough in backup that differentiates Azure from any other public cloud. The workload backup capability is built as an infrastructure-less, pay-as-you-go (PAYG) service that leverages native SQL backup and restore APIs to provide a comprehensive solution for backing up SQL Servers running in Azure IaaS VMs.
  • Azure Webinar Series: 4 Reasons to Choose Azure for Your Infrastructure Needs - Microsoft Azure Infrastructure as a Service (IaaS) provides computing, networking, and storage services with a high degree of control, security, and simplicity. In this webinar, you’ll learn the advantages of IaaS, see how it differs from competitors, and walk away with ideas for moving your workloads to the cloud.

Now in preview

Speech services now in preview - As announced at Build 2018, Speech service is available in preview, including Speech to Text with custom speech, Text to Speech with custom voice, and Speech Translation. In addition, the Speech SDK is also available in preview, which will be the single SDK for most of our speech services, and will require only one Azure subscription key for speech recognition and LUIS (language understanding service). The Speech Devices SDK is available as a restricted preview to approved device partners.

App Service Deployment Center (Preview) - App Service Deployment Center is a new experience in preview for setting up deployments to Azure App Service. It provides a centralized overview for all of the deployment options available to you and a guided experience to set up your deployments.

Azure AD Conditional Access support for blocking legacy auth is in Public Preview! - Azure AD Conditional Access support for blocking legacy authentication is available in public preview, which enables you to manage legacy authentication blocking as one part of your overall conditional access strategy, all from within the Azure AD admin console.

Process more files than ever and use Parquet with Azure Data Lake Analytics - Azure Data Lake Analytics (ADLA) increased the scale limit to schematize and process several hundred thousand files in a single U-SQL job using so-called file sets. In addition, we have improved the handling of many small files by grouping up to 200 files of at most 1 GB of data into a single vertex (in preview). Also in this latest release, ADLA adds a public preview of the native extractor and outputter for the popular Parquet file format and a “private” preview for Optimized Row Columnar (ORC), making it easy to both consume and produce these popular data formats at large scale.
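With the new native extractor and outputter, reading and writing Parquet from U-SQL looks like any other format. A minimal sketch, assuming hypothetical paths and schema (the {*} pattern is the same file-set mechanism whose scale limits were raised):

@events =
    EXTRACT UserId int,
            EventTime DateTime
    FROM "/input/events/{*}.parquet"
    USING Extractors.Parquet();

OUTPUT @events
    TO "/output/events.parquet"
    USING Outputters.Parquet();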

Also in preview

Now generally available

Offering the largest scale and broadest choice for SAP HANA in the cloud - Microsoft is committed to offering the most scale and performance for SAP HANA in the public cloud, and last week announced additional SAP HANA offerings on Azure at SAPPHIRE NOW 2018 in Orlando, Florida. The SAP Cloud Platform offers developers a choice to build their SAP applications and extensions using a PaaS development platform with integrated services, which is now generally available on Azure. Developers can now deploy Cloud Foundry based SAP Cloud Platform on Azure in the West Europe region. We’re working with SAP to enable more regions in the months ahead.

News and updates

Use Azure Monitor to integrate with SIEM tools - We’ve been partnering with the top security information and event management (SIEM) partners to build connectors that get the data from Azure Monitor into their tools. This post covers what you should do based on the SIEM tool(s) you are using and your current integration status. Azure Monitor’s SIEM integration capabilities can’t do everything the Azure Log Integration tool could do just yet, so this post also provides a roadmap for addressing known gaps between them.

Azure Search is now certified for several levels of compliance - Compliance is an important factor for customers when looking at software and services as they look to meet their own compliance obligations across regulated industries and markets worldwide. For that reason, we are excited to announce that Azure Search has been certified for several levels of compliance, such as HIPAA and the HITECH Act. Azure compliance offerings are grouped into four segments: globally applicable, US government, industry specific, and region/country specific.

Azure Data Lake Tools for VSCode supports Azure blob storage integration - The Azure Data Lake Tool for VSCode is an extension for developing U-SQL projects with Microsoft Azure Data Lake. The integration of the VSCode explorer with Azure blob storage enables you to navigate your blob storage and access and manage your blob containers, folders, and files using the Data Lake Explorer.

Additional news and updates

The Azure Podcast

The Azure Podcast: Episode 232 - Java on Azure - Continuing our sessions from BUILD 2018, here is a great discussion with Asir Selvasingh on running Java on Azure.

Technical content and training

Get started with U-SQL: It’s easy! - Azure Data Lake Analytics combines declarative and imperative concepts in the form of a new language called U-SQL, which is the big data query language and execution framework in Azure Data Lake Analytics. This post provides the steps you need to get ramped up on U-SQL.

Use Azure Data Lake Analytics to query AVRO data from IoT Hub - Apache Avro is a framework that provides a file format for data serialization and remote procedure calls. Using U-SQL and an Avro extractor, we can parse, transform, and query our IoT Hub data. IoT Hub will write blob content in Avro format, which has both message body and message properties. This post includes an episode of the IoT Show that demonstrates using Azure Data Lake Analytics to analyze data from IoT devices collected through Azure IoT Hub.
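The Avro extractor comes from the U-SQL samples library rather than the box product, so the sketch below is illustrative only: the assembly registration and the heavily trimmed reader schema are assumptions (the real IoT Hub Avro schema includes additional fields such as message properties), and the post and samples repository should be consulted for the exact setup.

REFERENCE ASSEMBLY [Microsoft.Analytics.Samples.Formats];
// The samples library may require further references (e.g., Newtonsoft.Json).

@telemetry =
    EXTRACT EnqueuedTimeUtc string,
            Body byte[]
    FROM "/iothub-archive/{*}/{*}.avro"
    USING new Microsoft.Analytics.Samples.Formats.ApacheAvro.AvroExtractor(@"
      { ""type"": ""record"", ""name"": ""Message"",
        ""fields"": [
          { ""name"": ""EnqueuedTimeUtc"", ""type"": ""string"" },
          { ""name"": ""Body"", ""type"": [""null"", ""bytes""] }
        ] }");

@messages =
    SELECT EnqueuedTimeUtc,
           System.Text.Encoding.UTF8.GetString(Body) AS Payload
    FROM @telemetry;

OUTPUT @messages
    TO "/output/telemetry.csv"
    USING Outputters.Csv();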

Cybersecurity Reference Architecture: Security for a Hybrid Enterprise - The Microsoft Cybersecurity Reference Architecture describes Microsoft’s cybersecurity capabilities and how they integrate with existing security architectures and capabilities. We made quite a few changes in v2 and this post highlights some of what has changed as well as the underlying philosophy of how this document was built.

Detecting script-based attacks on Linux - In April, Azure Security Center (ASC) extended its Linux threat detection preview program to include detection of suspicious processes, suspect login attempts, and anomalous kernel module loads. This post demonstrates how existing Windows detections often have Linux analogs, such as base64-encoded shell and script attacks.

App Service Diagnostics – Profiling an ASP.NET Web App on Azure App Service - This post covers the Collect .NET Profiler Trace option in detail and how you can use it to troubleshoot a slow or failing ASP.NET-based Web App. The trace it collects helps you identify the underlying .NET exceptions and the time taken in the various request processing pipeline stages, and even lets you drill down into the exact methods within the application code that are taking time.

8 reasons to choose Azure Stream Analytics for real-time data processing - Azure Stream Analytics is Microsoft’s serverless real-time analytics offering for complex event processing. It enables customers to unlock valuable insights and gain competitive advantage by harnessing the power of big data. This post provides eight reasons why you should choose ASA for real-time analytics.

Azure tips & tricks

Load Testing web app with Azure App Services

Working with App Settings and Azure App Service

Customers and partners

Regenerative Maps alive on the Edge - Mapbox announced it will integrate its Vision SDK with the Microsoft Azure IoT platform, enabling developers to build innovative applications and solutions for smart cities, the automotive industry, public safety, and more. Microsoft’s collaboration with Mapbox illustrates our commitment to cloud and edge computing in conjunction with the proliferation of Artificial Intelligence. The Mapbox Vision SDK, plus Azure IoT Hub and Azure IoT Edge, simplifies the flow of data from the edge to the Azure cloud and back out to devices.

Azure Marketplace new offers: May 1–15 - The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. In the first half of May we published 28 new offers, including: Solar inCode, Infrastructure for SAP Netweaver and SAP HANA, and TeamCity.

Mellanox uses Azure to accelerate network design - Mellanox’s journey to running HPC workloads in Azure started a few years ago. At first, they were looking for services like disaster recovery, where they could keep the environment deprovisioned most of the time. Then they started looking at moving services like backups to the cloud. Mellanox worked with Univa, a leading provider of HPC scheduling and orchestration software, to evaluate different public cloud options, and ultimately chose Azure.

Using STMicroelectronics starter kits to connect to Azure IoT in minutes - STMicroelectronics offers a wide range of IoT hardware along with pre-integrated software, a powerful development ecosystem and valuable starter kits. Microsoft partners with silicon vendors such as STMicroelectronics to simplify and accelerate the development of embedded systems, so our customers can move projects from proof of concepts to production faster.

The IoT Show | STMicroelectronics Starter kits for Azure IoT - Manuel Cantone, from STMicroelectronics, presents the latest developer kits that seamlessly connect to Azure IoT. These kits let you get started with Azure IoT in a matter of minutes...demonstrated live!

Events

AltConf: Empowering developers to ship iOS apps that scale - Whether you are an Objective-C or Swift developer, Azure has what you need to ship your apps faster and with more confidence. Last week, Microsoft was a Gold Sponsor at AltConf, a free, community-driven event that happens in parallel with Apple's WWDC. We were there to talk about how we can help you build intelligent iOS apps that scale. Check out the information in this post to learn more.

Spark + AI Summit: Azure is the best place for analytics - To continue driving innovation for our customers, we made a couple of exciting enhancements to Azure Databricks that we announced last week at the Spark + AI Summit: support for GPU-enabled virtual machines to build, train and deploy AI models at scale with Azure Databricks, and enhanced capabilities for Azure Databricks Runtime for Machine Learning in preview as part of the premium SKU in Azure Databricks.

Azure Friday

Azure Friday | Episode 437 - TITLE - Carey MacDonald stops by to chat with Scott Hanselman about Azure Cosmos DB, a managed database service with unlimited horizontal scale, allowing you to scale storage and throughput independently. In this video, Carey will cover how to partition your data in Azure Cosmos DB, best practices for choosing a partition key, and how to troubleshoot bad partition key choices.

Azure Friday | Episode 438 - TITLE - Jayanta Mondal stops by to chat with Scott Hanselman about Gremlin, the traversal query language for Cosmos DB graph. Because Gremlin is a dataflow language and procedural in nature, writing efficient Gremlin queries requires knowledge of the graph structure and the query execution plan. Learn how to structure a Gremlin query, along with best practices and tips & tricks to get the best performance.

Developer spotlight

A decision tree for Azure compute services - This flowchart will help you to choose an Azure compute service for your application by guiding you through a set of key decision criteria to reach a recommendation.

KubeCon 2018 Go-Node Remote Debug - The end-to-end multi-service remote container debugging demo from KubeCon 2018 in Copenhagen.

Tutorial: Java with Docker in VS Code - This tutorial will walk you through building and deploying a Docker image for a Java application with Visual Studio Code.

.NET Microservices Architecture e-book - This guide is an introduction to developing microservices-based applications and managing them using containers. It discusses architectural design and implementation approaches using .NET Core and Docker containers.

Building Resilient Microservices with .NET Core and AKS - In this session from Build 2018 we'll show you how we're making .NET Core microservices easier to build with new application patterns in .NET Core 2.1 as well as how to deploy and manage them with Kubernetes and Helm.

The IoT Show

The IoT Show | Azure IoT Hub device SDK for Python - Are you a Python developer wondering how to get your feet wet developing for Azure IoT? We’ve got you covered! Zoltan Varga, lead developer on the Azure IoT SDK for Python, joined us on the IoT Show to show us how it's done!

AI Show

AI Show | Video Moderation with Content Moderator - Content moderation is the process of monitoring for possible offensive, undesirable, and risky content. Content Moderator, a Cognitive Services product, combines machine-assisted content moderation APIs and a human review tool for images, text, and videos into a complete content moderation solution. In this episode, we will get an overview of Content Moderator and learn about its video moderation capabilities. We will cover the API and the human review capabilities in Content Moderator.

AI Show | Image and Text Moderation with Content Moderator - Content moderation is the process of monitoring for possible offensive, undesirable, and risky content. Content Moderator, a Cognitive Services product, combines machine-assisted content moderation APIs and a human review tool for images, text, and videos into a complete content moderation solution. In this episode, we will get an overview of Content Moderator and learn about the image moderation and text moderation capabilities. We will cover all features of both APIs.

Azure this Week

Azure this Week - 8 June 2018 - In this episode of Azure This Week, James takes a look at the public preview of Storage Explorer in the Azure Portal, the public preview of Standard SSD Disks for Azure VMs, the General Availability of cross-region disaster recovery for Azure VMs and the announcement that Microsoft is to acquire GitHub.

Azure Blob Storage lifecycle management in public preview


Last year, we released Blob-Level Tiering, which allows you to transition blobs between the Hot, Cool, and Archive tiers without moving data between accounts. Both Blob-Level Tiering and Archive Storage help you optimize storage performance and cost. You asked us to make it easier to manage and automate, so we did. Today we are excited to announce the public preview of Blob Storage lifecycle management so that you can automate blob tiering and retention with lifecycle management policies.

Lifecycle management

Data sets have unique lifecycles. Some data is accessed often early in the lifecycle, but the need for access drops drastically as the data ages. Some data remains idle in the cloud and is rarely accessed once stored. Some data expires days or months after creation, while other data sets are actively read and modified throughout their lifetimes. Azure Blob Storage lifecycle management offers a rich, rule-based policy which you can use to transition your data to the best access tier and to expire data at the end of its lifecycle.

Lifecycle management policy helps you:

  • Transition blobs to a cooler storage tier (Hot to Cool, Hot to Archive, or Cool to Archive) to optimize for performance and cost
  • Delete blobs at the end of their lifecycles
  • Define rules to be executed once a day at the storage account level (it supports both GPv2 and Blob storage accounts)
  • Apply rules to containers or a subset of blobs (using prefixes as filters)

See Managing the Azure Storage Lifecycle to learn more.

Example

Consider a data set that is accessed frequently during the first month, is needed only occasionally for the next two months, is rarely accessed afterward, and is required to be expired after seven years. In this scenario, Hot storage is the best tier to use initially, Cool storage is appropriate for occasional access, and Archive storage is the best tier option after several months before it is deleted seven years later.

The following sample policy manages the lifecycle for such data. It applies to block blobs with prefix “foo”:

  • Tier blobs to Cool storage 30 days after last modification
  • Tier blobs to Archive storage 90 days after last modification
  • Delete blobs 2,555 days (seven years) after last modification
  • Delete blob snapshots 90 days after snapshot creation
{
   "version": "0.5",
   "rules": [
     {
       "name": "ruleFoo",
       "type": "Lifecycle",
       "definition": {
         "filters": {
           "blobTypes": [ "blockBlob" ],
           "prefixMatch": [ "foo" ]
         },
         "actions": {
           "baseBlob": {
             "tierToCool": { "daysAfterModificationGreaterThan": 30 },
             "tierToArchive": { "daysAfterModificationGreaterThan": 90 },
             "delete": { "daysAfterModificationGreaterThan": 2555 }
           },
           "snapshot": {
             "delete": { "daysAfterCreationGreaterThan": 90 }
           }
         }
       }
     }
   ]
}
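Once the feature is registered for your subscription (see "How to get started" below), a policy like the one above is attached to the storage account. As a hedged sketch, recent Azure CLI builds expose this through the management-policy command group; the exact mechanism during the preview may have differed, and policy.json here stands for the JSON above saved to a file:

az storage account management-policy create --account-name mystorageaccount --resource-group myresourcegroup --policy @policy.json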

Azure portal

[Screenshot: lifecycle management in the Azure portal]

How to get started

To enroll in the public preview, you will need to submit a request to register this feature for your subscription. After your request is approved (within a few days), any existing and new GPv2 or Blob Storage account in West US 2 and West Central US will have the feature enabled. During the preview, only block blobs are supported. As with most previews, this feature should not be used for production workloads until it reaches general availability.

To submit a request, run the following PowerShell or CLI commands.

PowerShell

Register-AzureRmProviderFeature -FeatureName DLM -ProviderNamespace Microsoft.Storage

CLI 2.0

az feature register --namespace Microsoft.Storage --name DLM
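Registration is asynchronous and completes only once the request is approved. As a sketch using the standard provider-feature commands (not specific to this preview), you can poll the state until it reports Registered:

Get-AzureRmProviderFeature -FeatureName DLM -ProviderNamespace Microsoft.Storage

az feature show --namespace Microsoft.Storage --name DLM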

Cost

The lifecycle management feature is free of charge during the preview. Customers are charged the regular operation cost for the List Blobs and Set Blob Tier API calls. See Block Blob pricing to learn more about pricing.

Get it, use it, and tell us about it

We are confident that Blob Storage lifecycle management policies will simplify your cloud storage cost optimization strategy. We look forward to hearing your feedback on this feature through email at DLMFeedback@microsoft.com. As a reminder, we love hearing all of your ideas and suggestions about Azure Storage, which you can post in the Azure Storage feedback forum.

Rendering in Azure with Qube 7


Rendering is the most compute-intensive part of a movie production. By some estimates, rendering can account for 90 percent of the total compute hours over the entire life of an animated feature or visual-effects-laden feature film. Rendering is a highly complex, distributed process with dependencies between render jobs and digital assets.

To address this complexity, companies like PipelineFX offer render management solutions that simplify the management and control of render pipelines. Their solution integrates on-premises environments with cloud service providers such as Microsoft Azure, giving their customers access to scale and elasticity for accelerating time to production.

PipelineFX recently announced their latest software release, Qube 7. It has a great set of new features that you can read more about on their website. But one thing in particular caught my eye: the Supervisor is now free to use. Not only that, but the rendering component is free for the first 30 days, regardless of scale.
 
As you probably know already, Azure scales out to big numbers. With the 30-day trial of Qube and our low-priority Virtual Machines, you can get a lot of rendering done without spending a lot of money. Here’s what you need to do:

With the free trials, you can quickly get a taste of how Azure lets you scale your render farm to meet your needs while paying only for what you use. To learn more about rendering in Azure, visit our Batch Rendering page. You can read more about how Qvisten used Azure to meet deadlines for Anchors Up on the PipelineFX site.
