
Microsoft releases automation for HIPAA/HITRUST compliance


I am excited to share our new Azure Security and Compliance Blueprint for HIPAA/HITRUST – Health Data & AI. Microsoft’s Azure Blueprints are resources to help build and launch cloud-powered applications that comply with stringent regulations and standards. Included in the blueprints are reference architectures, compliance guidance and deployment scripts.

“The best part of the Azure Security & Compliance Blueprint is that it encompasses the exact Azure services architecture required to help customers meet their HIPAA and HITRUST security, privacy, and compliance obligations, along with supporting documentation and a fully-automated deployment process.”

- Tibi Popp, CTO, Archive360

Health organizations all over the world are looking to leverage the power of AI and the cloud to improve outcomes, accelerate performance, and enable the vision of precision medicine. “We are enthusiastic about the potential to foster multi-institutional collaborative environments for data sharing and machine learning,” said Chuck Mayo, PhD, of the University of Michigan Medicine. Microsoft is working to meet these challenges with Healthcare NExT, an initiative that aims to accelerate healthcare innovation through artificial intelligence and cloud computing, while at the same time working to protect the privacy and confidentiality of patients.

“We are entrusted with our customers’ and their patients’ most personal data. Cloud unlocks our ability to leverage this data and apply machine learning at scale to save more lives. Securing, governing, and protecting Protected Health Information (PHI) on cloud is an incredible opportunity and responsibility. The blueprint helps us draw from best practices to protect and leverage PHI on the Cloud (for scenarios like Length of Stay Prediction, Clinical Analytics, etc.).”

- Dr. Ankur Teredesai, Chief Technology Officer, KenSci

The new blueprint provides secure implementation automation for building solutions in environments supporting Health Insurance Portability and Accountability Act (HIPAA), a US healthcare law that establishes safeguards for individually identifiable health information; as well as the Health Information Trust Alliance (HITRUST) framework, a widely recognized security accreditation in the healthcare industry. The blueprint is intended to serve as a modular foundation for customers to adjust to their specific requirements and accelerate the development of machine learning experiments to solve both clinical and operational use case scenarios.

“Clinical informatics teams are entrusted with a very time sensitive mission – to deliver the right care to the right patient at the right time. And our pace of innovation determines how many human lives we can impact. Solution frameworks like the blueprint accelerate my team’s mission to fight Death with Data Science by enabling quicker access to data and insight.”

- Dr. Greg McKelvey, Chief Medical Officer, KenSci


The blueprint is designed to demonstrate how to deploy a secure end-to-end health solution that contains PHI, and includes:

  • A sample use case scenario: A machine learning experiment for predicting length of stay, a sample data set of 100,000 patient records formatted using Fast Healthcare Interoperability Resources (FHIR), and data visualization using Power BI.
  • Deployment template & automation scripts: Azure Resource Manager templates and PowerShell automation scripts are used to automatically deploy the components of the architecture into Azure by specifying configuration parameters during setup, and pre-load the same data and use case scenario.
  • Cybersecurity threat model & component architecture: A comprehensive threat model provided in tm7 format for use with the Microsoft Threat Modeling Tool, detailing the components of the solution, the data flows between them, and the trust boundaries. The threat model is designed to help customers better understand the points of potential risk in the component reference architecture.
  • Customer responsibility matrix: A Microsoft Excel workbook listing the relevant HIPAA/HITRUST requirements and explaining Microsoft and customer areas of responsibility.
  • External compliance review: A report produced by Coalfire Systems with an auditor's review of the solution, and considerations for transforming the blueprint into a production-ready deployment, covering both HIPAA and HITRUST.

The blueprint provides all the building blocks to successfully start using a serverless cloud solution in under 25 minutes! For a quick look at how this solution works, watch this five-minute video explaining and demonstrating blueprint deployment.

“The blueprint has helped IRIS to justify we are going in the right direction, and that we are not doing anything rogue when it comes to security, privacy, and compliance."

- Jonathan Stevenson, Chief Information Officer of Intelligent Retinal Imaging Systems.

To learn more, join Microsoft at HIMSS 2018 and discover how health organizations across the globe are partnering with Microsoft to move beyond digitization into transformation and rallying with innovation. Learn more about Azure Blueprints on the Service Trust Portal. Learn more about Microsoft’s compliance with HIPAA and HITRUST on the Microsoft Trust Center.


Last week in Azure #21: Week of February 26


A post on the Microsoft Imagine blog (Students: Get career-ready with Azure for Students) announced the new Azure for Students offer, which gets verified students started with US$100 in Azure credits to be used within the first 12 months plus select free services (subject to change) without requiring a credit card at sign-up. For all the details, see the Azure for Students FAQ.

Now in preview

Announcing new milestones for Microsoft Cognitive Services vision and search services in Azure - Microsoft Custom Vision service, now available in public preview on the Azure portal, makes it possible for developers to easily train a classifier with their own data, export the models, and embed these custom classifiers directly in their applications to run offline in real time on iOS, Android, and many other edge devices. In addition, Bing Entity Search, which developers can use to identify the most relevant entity for a search term and surface primary details about it, is now generally available on the Azure portal.
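As a rough sketch of what calling a trained Custom Vision classifier over REST can look like (the endpoint URL, project ID, and key below are placeholders, not taken from the announcement; check the Custom Vision documentation for the exact request format):

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class CustomVisionPredictionExample
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // Placeholder key and endpoint; real values come from your Custom Vision project settings.
            client.DefaultRequestHeaders.Add("Prediction-Key", "<your-prediction-key>");
            var url = "https://<region>.api.cognitive.microsoft.com/customvision/v1.1/Prediction/<project-id>/image";

            // Send an image and print the raw JSON prediction results.
            using (var content = new ByteArrayContent(File.ReadAllBytes("test.jpg")))
            {
                var response = await client.PostAsync(url, content);
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }
    }
}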

Now generally available

Confidently plan your cloud migration: Azure Migrate is now generally available! - Azure Migrate is offered at no additional charge and provides appliance-based, agentless discovery of your on-premises environments. It enables discovery of VMware-virtualized Windows and Linux VMs today and will enable discovery of Hyper-V environments in the future. It also provides an optional, agent-based discovery for visualizing interdependencies between machines to identify multi-tier applications.

StorSimple Data Manager now generally available - Transform data from StorSimple format into the native format in Azure blobs or Azure Files. Once your data is transformed, you can use services like Azure Media Services, Azure Machine Learning, HDInsight, Azure Search, and more.

Introducing Azure Advanced Threat Protection - Azure Advanced Threat Protection (ATP) is a cloud-based security solution that helps you detect and investigate security incidents across your networks. It supports the most demanding workloads of security analytics for the modern enterprise. Leveraging the scale and intelligence of Azure, when we detect a new possible threat or attack method, we can automatically update all active tenants. This means that your threat detection capabilities are always up to date.

News & updates

Updates to Azure Database for MySQL and Azure Database for PostgreSQL - We announced these services in preview last year, and we recently made a number of updates based on customer feedback. Changes include pricing tier tweaks, a move from compute units to vCores, the ability to flexibly configure and scale storage, and additional options for backup and geo-restore.

B-series burstable VM support in AKS now available - Burstable VMs (B-series) are significantly cheaper than standard recommended VMs like Standard_DS2_V2. B-series VMs are particularly suited for development and test environments where performance requirements are bursty rather than consistent.

Azure Service Bus now integrates with Azure Event Grid! - The key scenario this feature enables is that Service Bus queues, topics, or subscriptions with low message volumes no longer require a receiver to poll for messages at all times. Service Bus now sends events to Azure Event Grid when there are messages in a queue and no receivers are present.
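To illustrate the flow, here is a minimal sketch (not an official sample) of an Azure Function that wakes up on the Event Grid notification and drains the queue; the function name, connection string, and queue name are hypothetical, and it assumes the Microsoft.Azure.ServiceBus package and the Event Grid trigger binding:

using System.Threading.Tasks;
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Azure.ServiceBus.Core;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;

public static class DrainQueueOnNotification
{
    // Hypothetical values; substitute your own namespace connection string and queue name.
    private const string ServiceBusConnectionString = "<service-bus-connection-string>";
    private const string QueueName = "myqueue";

    [FunctionName("DrainQueueOnNotification")]
    public static async Task Run([EventGridTrigger] EventGridEvent eventGridEvent, ILogger log)
    {
        log.LogInformation($"Event Grid notification received: {eventGridEvent.EventType}");

        // Event Grid signaled that messages are waiting with no receivers, so receive a batch now.
        var receiver = new MessageReceiver(ServiceBusConnectionString, QueueName);
        var messages = await receiver.ReceiveAsync(32);
        if (messages != null)
        {
            foreach (var message in messages)
            {
                // ...process message.Body here...
                await receiver.CompleteAsync(message.SystemProperties.LockToken);
            }
        }
        await receiver.CloseAsync();
    }
}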

Azure Load Balancer to become more efficient - We recently introduced an advanced, more efficient Load Balancer platform. This platform adds a whole new set of abilities for customer workloads using the new Standard Load Balancer. One of the key additions to the new Load Balancer platform is a simplified, more predictable, and efficient outbound port allocation algorithm.

HDInsight Tools for VSCode integrates with Ambari and HDInsight Enterprise Secure Package - You can now connect HDInsight Tools for VSCode to an HDInsight cluster through Ambari for job submissions using an Ambari managed username and password, or domain username and password. The Ambari connection applies to Spark and Hive clusters in all the Azure environments that host HDInsight services.

Technical content

Transform your industry with new Microsoft IoT in Action webinar series - The IoT in Action webinars are a series of live virtual events for companies ready to capitalize on the multi-billion-dollar IoT market. These live and on-demand webinars show what’s possible with IoT while also educating on how to do it. Filled with IoT thought leadership and technical know-how shared by industry pioneers from Microsoft and our partners, each webinar is crafted to inspire development across the IoT partner ecosystem and help you better meet customers’ needs.

Security Center Playbooks and Azure Functions Integration with Firewalls - You can manually run a Security Center Playbook when a Security Center alert is triggered, reducing time to response and helping you stay in control of your security posture. This post looks at a specific example of how Azure Functions work with Security Center Playbooks to help you rapidly respond to detected threats against your Palo Alto VM-Series firewall.

Developer spotlight

App Center | Push Notifications - Learn how to engage your users by sending them targeted push notifications with App Center Push.

Your guide to Azure services for apps built with Xamarin - The Mobile apps using Xamarin + Azure poster is a one-stop guide to the most relevant cloud services that Azure has to offer to you as a mobile developer with Visual Studio and Xamarin.

Zero to App in 20 Minutes: Build Your First Chat App with Microsoft Azure, Visual Studio App Center, and Rapid - In 20 minutes or less, get your first real-time chat app up and running. In this post, you'll learn how (and why) to add “chat apps” to your StackOverflow Developer Story, GitHub repos, or [insert your preferred skills profile]. Understand basic chat requirements and what separates the good from the great. Plus, you'll get code samples and step-by-step docs to get started.

Connect 2017 | Get your continuous delivery pipeline up and running in a few minutes - Learn how to easily get up and running with your apps inside Visual Studio App Center. Sign in and connect your Android, iOS, and Windows apps to the continuous integration, automated testing, distribution, monitoring, and engagement capabilities of App Center.

Azure Friday | Remote Debug Azure Functions in Java Using VS Code - Xiaokai He shows Donovan Brown how to quickly develop and deploy serverless functions to Azure, then go inside the black box and debug functions locally, as well as remotely in the cloud.

Service updates

Azure shows


Azure Friday | Serverless APIs with Azure Function Proxies - Alex Karcher joins Donovan Brown to discuss Azure Function Proxies, the serverless API toolbox. Proxies give you a truly serverless experience to manage your APIs with dynamic billing and scaling, and a super simple setup process.


The Azure Podcast | Episode 218 - Ramping up on ARM templates - Based on feedback from our listeners, we chat with Richard Cheney, a Cloud Solution Architect at Microsoft. He gives us loads of advice to help our listeners get ramped up on developing ARM templates, including 'Project Citadel' - a whole set of resources to help you craft the best templates.

Azure Data Lake tools for VS Code now supports job view and job monitoring


We are happy to announce that job monitoring and job view have been added to the Azure Data Lake Tools for Visual Studio Code. Now, you can perform real-time monitoring for the jobs you submit. You can also view job summary and job details for historical jobs, as well as download any of the input or output data and resource files associated with a job.

Key Customer Benefits

  • Monitor job progress in real-time within VSCode for both local and ADL jobs.
  • Display job summary and data details for historical jobs.
  • Resubmit previously run jobs.
  • Download job inputs, outputs and resource data files.
  • View the job U-SQL script for a submitted job.

Summary of key new features

  • Job View Page: Display job summary and job progress within VSCode. 


  • Data Page: Display job input, output and resources files. Support file download.


  • Show Historical Jobs: Use command ADL: Show Jobs for both local and ADL historical jobs.


  • Set Default Context: Use command ADL: Set Default Context to set default context for current working folder.


How to install or update

Install Visual Studio Code and download Mono 4.2.x for Linux and Mac. Then get the latest Azure Data Lake Tools by going to the VSCode Extension repository or the VSCode Marketplace and searching for Azure Data Lake Tools.


For more information about Azure Data Lake Tools for VSCode, please use the following resources:

Learn more about today’s announcements on the Azure blog and the Big Data blog. Discover more on the Azure service updates page.

If you have questions, feedback, comments, or bug reports, please use the comments below or send a note to hdivstool@microsoft.com.

ASP.NET Core 2.1.0-preview1: Functional testing of MVC applications


For ASP.NET Core 2.1 we have created a new package, Microsoft.AspNetCore.Mvc.Testing, to help streamline in-memory end-to-end testing of MVC applications using TestServer.

This package takes care of some of the typical pitfalls when trying to test MVC applications using TestServer.

  • It copies the .deps file from your project into the test assembly bin folder.
  • It sets the content root to the application's project root so that static files and views can be found.
  • It provides a class WebApplicationTestFixture<TStartup> that streamlines the bootstrapping of your app on TestServer.

Create a test project

To try out the new MVC test fixture, let's create an app and write an end-to-end in-memory test for the app.

First, create an app to test.

dotnet new razor -au Individual -o TestingMvc/src/TestingMvc

Add an xUnit based test project.

dotnet new xunit -o TestingMvc/test/TestingMvc.Tests

Create a solution file and add the projects to the solution.

cd TestingMvc
dotnet new sln
dotnet sln add src/TestingMvc/TestingMvc.csproj
dotnet sln add test/TestingMvc.Tests/TestingMvc.Tests.csproj

Add a reference from the test project to the app we're going to test.

dotnet add test/TestingMvc.Tests/TestingMvc.Tests.csproj reference src/TestingMvc/TestingMvc.csproj

Add a reference to the Microsoft.AspNetCore.Mvc.Testing package.

dotnet add test/TestingMvc.Tests/TestingMvc.Tests.csproj package Microsoft.AspNetCore.Mvc.Testing -v 2.1.0-preview1-final

In the test project create a test using the WebApplicationTestFixture<TStartup> class that retrieves the home page for the app. The test fixture sets up an HttpClient for you that allows you to invoke your app in-memory.

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc.Testing;
using TestingMvc;
using Xunit;

namespace TestingMvc.Tests
{
    public class TestingMvcFunctionalTests : IClassFixture<WebApplicationTestFixture<Startup>>
    {
        public TestingMvcFunctionalTests(WebApplicationTestFixture<Startup> fixture)
        {
            Client = fixture.Client;
        }

        public HttpClient Client { get; }

        [Fact]
        public async Task GetHomePage()
        {
            // Arrange & Act
            var response = await Client.GetAsync("/");

            // Assert
            Assert.Equal(HttpStatusCode.OK, response.StatusCode);
        }
    }
}

To correctly invoke your app, the test fixture tries to find a static method on the entry point class (typically Program) of the assembly containing the Startup class with the following signature:

public static IWebHostBuilder CreateWebHostBuilder(string[] args)

Fortunately, the built-in project templates are already set up this way:

namespace TestingMvc
{
    public class Program
    {
        public static void Main(string[] args)
        {
            CreateWebHostBuilder(args).Build().Run();
        }

        public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
            WebHost.CreateDefaultBuilder(args)
                .UseStartup<Startup>();
    }
}

If you don't have the Program.CreateWebHostBuilder method, the test fixture won't be able to initialize your app correctly for testing. Instead, you can configure the WebHostBuilder yourself by overriding CreateWebHostBuilder on WebApplicationTestFixture<TStartup>.
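As a rough sketch of what that override might look like (this assumes the method is a parameterless virtual on the fixture, which may differ from the actual preview API):

public class CustomWebApplicationTestFixture<TStartup> : WebApplicationTestFixture<TStartup>
    where TStartup : class
{
    // Hypothetical override; consult the Microsoft.AspNetCore.Mvc.Testing API for the exact signature.
    protected override IWebHostBuilder CreateWebHostBuilder() =>
        WebHost.CreateDefaultBuilder()
            .UseStartup<TStartup>();
}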

Specifying the app content root

The test fixture will also attempt to guess the content root of the app under test. By convention the test fixture assumes the app content root is at <<SolutionFolder>>/<<AppAssemblyName>>. For example, based on the folder structure defined below, the content root of the application is defined as /work/MyApp.

/work
    /MyApp.sln
    /MyApp/MyApp.csproj
    /MyApp.Tests/MyApp.Tests.csproj

Because we are using a different layout for our projects, we need to inherit from WebApplicationTestFixture and pass in the relative path from the solution to the app under test when calling the base constructor. In a future preview we plan to make configuration of the content root unnecessary, but for now this explicit configuration is required for our solution layout.

public class TestingMvcTestFixture<TStartup> : WebApplicationTestFixture<TStartup> where TStartup : class
{
    public TestingMvcTestFixture()
        : base("src/TestingMvc") { }
}

Update the test class to use the derived test fixture.

public class TestingMvcFunctionalTests : IClassFixture<TestingMvcTestFixture<Startup>>
{
    public TestingMvcFunctionalTests(TestingMvcTestFixture<Startup> fixture)
    {
        Client = fixture.Client;
    }

    public HttpClient Client { get; }

    [Fact]
    public async Task GetHomePage()
    {
        // Arrange & Act
        var response = await Client.GetAsync("/");

        // Assert
        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
    }
}

For some end-to-end in-memory tests to work properly, shadow copying needs to be disabled in your test framework of choice, as it causes the tests to execute in a different folder than the output folder. For instructions on how to do this with xUnit see https://xunit.github.io/docs/configuring-with-json.html.
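For example, a minimal xunit.runner.json placed in the test project (and copied to the output directory) that turns shadow copying off looks like this:

{
  "shadowCopy": false
}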

Run the test

Run the test by running dotnet test from the TestingMvc.Tests project directory. It should fail because the HTTP response is a temporary redirect instead of a 200 OK. This is because the app has HTTPS redirection middleware in its pipeline (see Improvements for using HTTPS) and the base address set up by the test fixture is an HTTP address ("http://localhost"). The HttpClient by default doesn't follow these redirects. In a future preview we will update the test fixture to configure the HttpClient to follow redirects and also handle cookies. But at least now we know the test is successfully running the app's pipeline.

This test was intended to make a simple GET request to the app's home page, not to test the HTTPS redirect logic, so let's reconfigure the HttpClient to use an HTTPS base address instead.

public TestingMvcFunctionalTests(TestingMvcTestFixture<Startup> fixture)
{
    Client = fixture.Client;
    Client.BaseAddress = new Uri("https://localhost");
}

Rerun the test and it should now pass.

Starting test execution, please wait...
[xUnit.net 00:00:01.1767971]   Discovering: TestingMvc.Tests
[xUnit.net 00:00:01.2466823]   Discovered:  TestingMvc.Tests
[xUnit.net 00:00:01.2543165]   Starting:    TestingMvc.Tests
[xUnit.net 00:00:09.3860248]   Finished:    TestingMvc.Tests

Total tests: 1. Passed: 1. Failed: 0. Skipped: 0.
Test Run Successful.

Summary

We hope the new MVC test fixture in ASP.NET Core 2.1 will make it easier to reliably test your MVC applications. Please give it a try and let us know what you think on GitHub.

New Git Features in Visual Studio 2017 Update 6

This week we released Visual Studio 2017 Update 6. In this release, you can now push, delete, and view all of the Git tags in your repository. Additionally, if you use Visual Studio Team Services (VSTS), you can check out pull request branches, making it easier to review, test, and build changes. To learn more about what else... Read More

Visibility into network activity with Traffic Analytics – now in public preview


Today, we are announcing the public preview of Traffic Analytics, a cloud-based solution that provides visibility into user and application traffic on your cloud networks.

Traffic Analytics analyzes NSG Flow Logs across Azure regions and equips you with actionable information to optimize workload performance, secure applications and data, audit your organization’s network activity and stay compliant.

With Traffic Analytics, you now can:

  • Gain visibility into network activity across your cloud networks. The solution provides insights on:

    • traffic flows between Azure and the Internet, within Azure, and across public cloud regions, VNETs, and subnets.
    • inter-relationships between critical business services and applications.
    • applications and protocols on your network, without the need for sniffers or dedicated flow collector appliances.
  • Secure your network by identifying threats on it, such as:

    • flows between your VMs and rogue networks.
    • network ports open to the Internet.
    • applications attempting Internet access.
    • anomalous network traffic behavior (e.g., back-end servers attempting connectivity to servers outside your network).
  • Improve performance of your applications by:

    • capacity planning - eliminate over-provisioning or underutilization by monitoring utilization trends of VPN gateways and other services.
    • analyzing in-bound and out-bound flows.
    • understanding application access patterns (e.g., where are the users located? Can application latency be reduced by better workload placement?).
    • identification of traffic hotspots.

With this release, Traffic Analytics brings rich flow monitoring capabilities to your Azure cloud that are on-par with capabilities available on campus networks (via NetFlow, IPFIX, sFlow based tools), without the need for packet capture or flow collection appliances.

Get Started

Find out what’s happening on your cloud!  Check out the documentation and start using Traffic Analytics on your Azure network.


Figure 1: NOC View provides an overview of flows across various regions in Azure.


Figure 2: Geo-map showing traffic across Azure regions.  Red dots indicate sources of malicious traffic.


Figure 3: VNET conversation map, with a summary of communicating countries, data centers, hosts etc.

We are listening

We would love to hear from you.  Send us your suggestions via the User Voice page.

NCv3 VMs generally available, other GPUs expanding regions


When we announced the preview of our new NCv3 virtual machines back in November, I knew they’d be very popular with our customers. NCv3 brings NVIDIA’s latest GPU – the Tesla V100 – to our best-in-class HPC, machine learning, and AI products to bring huge amounts of value across a variety of industries. One preview customer told us their speech recognition models trained in less than 20 minutes, instead of the 1-2 hours that previous generation GPUs required. Another customer told us about the 40-50% performance boost they saw on their reservoir simulations.

With these fantastic customer success stories, I am ecstatic to announce that the NCv3 virtual machines are now generally available in the US East region. We'll be adding NCv3 to EU West and US South Central later this month. We'll add AP Southeast in April and UK South and IN Central in May.

But this isn’t the only GPU announcement I am making today. We’re also expanding our NV series, which enables powerful remote visualization applications, into the US East 2, US Gov Virginia, and Central India regions. And our ND series, designed for AI and machine learning workloads, is expanding into the US South Central, AP Southeast, US East, and EU West regions.

I’m excited by this major expansion of our GPU fleet. GPUs are at the heart of a lot of discovery and innovation. And with our tools like Batch, Batch AI, and the Data Science Virtual Machine images, Azure customers are in a great position to expand what is possible.

If you’re attending NVIDIA’s GPU Technology Conference (GTC) in San Jose later this month, stop by our booth (booth 603) and learn more from the team.

Thanks,
Corey

Choosing Priors: Double-Yolk Bayesian Egg

by Subhadeep (Deep) Mukhopadhyay and Douglas Fletcher, Department of Statistical Science, Temple University 

Bayesians and Frequentists have long been ambivalent toward each other. The concept of “Prior” remains the center of this 250-year-old tug-of-war: frequentists view the prior as a weakness that can cloud the final inference, whereas Bayesians view it as a strength to incorporate expert knowledge into the data analysis. So the question naturally arises: how can we develop a Bayes-frequentist consolidated data analysis workflow that enjoys the best of both worlds?

To develop a “defendable and defensible” Bayesian learning model, we have to go beyond blindly ‘turning the crank’ based on a “go-as-you-like” [approximate guess] prior. A lackluster attitude towards prior modeling could lead to disastrous inference, impacting various fields from clinical drug development to presidential election forecasts. The real questions are: How can we uncover the blind spots of the conventional wisdom-based prior? How can we develop the science of prior model-building that combines both data and science [DS-prior] in a testable manner – a double-yolk Bayesian egg? Unfortunately, these questions are outside the scope of business-as-usual Bayesian modus operandi and require new ideas, which is the goal of the paper, “Bayesian Modeling via Goodness of Fit”. In the following, we demonstrate how to prepare the “Bayesian omelet” — the operational part — using the R package BayesGOF.

Our model-building approach proceeds sequentially as follows:

  1. It starts with a scientific (or empirical) parametric prior \(g(\theta;\alpha,\beta)\),
  2. inspects the adequacy and the remaining uncertainty of the elicited prior using a graphical exploratory tool,
  3. estimates the necessary “correction” for the assumed \(g\) by looking at the data,
  4. generates the final statistical estimate \(\hat{\pi}(\theta)\), and
  5. executes macro- and micro-level inference.

Our algorithmic solution yields answers to all five of the phases using one single algorithm, which we will now demonstrate for rat tumor data. The rat tumor data consists of observations of endometrial stromal polyp incidence in \(k=70\) groups of rats. For each group, \(y_i\) is the number of rats with polyps and \(n_i\) is the total number of rats in the experiment. The dataset is available in the R package BayesGOF.

The Rat-data model: \(y_i \overset{\rm ind}{\sim} \mbox{Binomial}(n_i, \theta_i)\), \(i=1,\ldots,k\), where the unobserved parameters \(\theta=(\theta_1,\ldots,\theta_k)\) are independent realizations from the unknown \(\pi(\theta)\).

Step 1. We begin by finding the starting parameter values for the parametric conjugate \(g \sim \mbox{Beta}(\alpha, \beta)\):

library(BayesGOF)
set.seed(8697)
data(rat)
###Use MLE to determine starting values
rat.start <- gMLE.bb(rat$y, rat$n)$estimate
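For context: since \(g\) is a beta prior on binomial counts, gMLE.bb presumably maximizes the beta-binomial marginal likelihood to obtain the starting values \((\alpha, \beta)\): \[ \prod_{i=1}^{k} \binom{n_i}{y_i} \frac{B(y_i + \alpha,\; n_i - y_i + \beta)}{B(\alpha, \beta)}, \] where \(B(\cdot,\cdot)\) is the beta function.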

We use our starting parameter values to run the main DS.prior function:

rat.ds <- DS.prior(rat, max.m = 6, rat.start, family = "Binomial")

Next we will discuss how to interpret and use this rat.ds object for exploratory Bayes modeling and prior uncertainty quantification.

Step 2. We display the U-function to quantify and characterize the uncertainty of the a priori selected \(g\):

plot(rat.ds, plot.type = "Ufunc")


The deviations from the uniform distribution (the red dashed line) indicate that our initial selection for \(g\), \(\mbox{Beta}(\alpha = 2.3, \beta = 14.1)\), is incompatible with the observed data and requires repair; the data indicate that there are, in fact, two different groups of incidence in the rats.

Step 3a. Extract the parameters for the nonparametrically corrected prior \(\hat{\pi}\):

rat.ds
## $g.par
##     alpha      beta 
##  2.304768 14.079707 
## 
## $LP.coef
##        LP1        LP2        LP3 
##  0.0000000  0.0000000 -0.5040361

Therefore, our estimated DS\((G,m)\) prior is given by: \[\hat{\pi}(\theta) = g(\theta; \alpha,\beta)\Big[1 - 0.52\,T_3(\theta;G)\Big].\]

The DS-prior has a unique two-component structure that combines a parametric \(g\) and a nonparametric \(d\) (which we call the U-function). Here \(T_j(\Theta;G)\), \(j = 1,\ldots,m\), are a specialized orthonormal basis given by \(\mbox{Leg}_j[G(\Theta)]\), members of the LP class of rank polynomials. Note that \({\rm DS}(G,m=0) \equiv g(\theta;\alpha,\beta)\). The truncation point \(m\) reflects the concentration of the true unknown \(\pi\) around the pre-selected \(g\).

Step 3b. Plot the estimated DS prior \(\hat{\pi}\) along with the original parametric \(g\):

plot(rat.ds, plot.type = "DSg")

 


MacroInference

The term "MacroInference" aims to answer the following question: how can we combine \(k\) binomial parameters to come up with an overall, macro-level aggregated statistical behavior of \(\theta_1,\ldots,\theta_k\)? This is often important in applied analysis, as the limited sample size of a single study hardly provides adequate evidence for a definitive conclusion.

Step 4. Here we are interested in the overall macro-level inference by combining the \(k=70\) parallel studies. The group-specific modes along with their SEs can be computed as follows:

rat.macro.md <- DS.macro.inf(rat.ds, num.modes = 2 , iters = 25, method = "mode") 
rat.macro.md
##      1SD Lower Limit   Mode 1SD Upper Limit
## [1,]          0.0161 0.0340          0.0520
## [2,]          0.1442 0.1562          0.1681
plot(rat.macro.md)


MicroInference

"Microinference" refers to the process of using information from historical studies to improve the estimates of one or more studies of particular interest. This is known as “borrowing strength” in Bayesian inference literature. It is noteworthy to mention that the classical Stein’s shrinkage does not work for rat data due to the presence of multiple partially exchangeable studies. Our adaptive (or selective) shrinkage technology selectively borrows strength from ‘similar’ experiments in an automated manner, by answering the important question: where to shrink?.

Step 5. In addition to the earlier \(k=70\) studies for the rat tumor data, we have a current experimental study that shows \(y_{71}=4\) out of \(n_{71}=14\) rats developed tumors. The following code performs the desired microinference for \(\theta_{71}\) (posterior distribution along with its mean and mode):

rat.y71.micro <- DS.micro.inf(rat.ds, y.0 = 4, n.0 = 14)
rat.y71.micro
## Posterior summary for y = 4, n = 14:
##  Posterior Mean = 0.1897
##  Posterior Mode = 0.1833
## Use plot(x) to generate posterior plot


The left plot (a) compares the posterior distributions for the parametric (g) (blue) and the DS posterior (red). The right plot (b) compares our adaptive shrinkage with Stein’s estimates. The vertical red triangles indicate the modes of the DS prior, while the blue triangle is the mode of the parametric (g). For additional real-data examples, please see the package vignette.

Conclusion

Almost all modern scientific research utilizes domain knowledge and data to come up with breakthrough results. But the fundamental problem of how to fuse this “approximate” scientific prior knowledge with the data at hand is not a settled issue even 250 years after the discovery of Bayes’ law (for more details, see “Bayes’ Theorem in the 21st Century” and “Statistical thinking for 21st century scientists”). Bayesian modeling via goodness-of-fit technology, synthesized in the R package BayesGOF, allows us to determine a scientific prior that is consistent with the data at hand, in a systematic and principled way.


Visual Studio 2017 version 15.6, Visual Studio for Mac version 7.4 Released


Today, we released updates to both Visual Studio 2017 and Visual Studio for Mac. Start your download now while you browse the rest of this post: download Visual Studio 2017 version 15.6 or Visual Studio for Mac. We’ll trigger the update notification flag in the tools in the coming days.

Visual Studio 2017 version 15.6

I’ll highlight some of the major changes in this post, but to see the complete list of changes and features, please see our Release Notes.

Performance

We have kept working to improve many aspects of performance, and with this update, three things stand out:

  • This update makes solution load for managed code in particular much faster. Users of .NET Core will experience an average of 20% faster load times, with a more noticeable improvement for solutions with 30+ projects.
  • The Debugger’s Threads window is even faster with this release. Now you can interact with Visual Studio while it processes data in the background, which is useful when you’re debugging multithreaded applications.
  • Many VS customers run extensions, which can impact performance. To help you troubleshoot performance issues, this update adds notifications for extensions that may be causing UI delays. This notification lets you directly disable the extension to improve performance or turn off future notifications.

Unit Testing

Improvements to .NET unit testing include real-time test discovery, and we added a hierarchy view to improve the navigation experience in the Test Explorer.

The real-time test discovery feature finds any C# and Visual Basic tests, even if you haven’t built your managed project. This feature uses Roslyn to update the Test Explorer in just seconds as you add, remove, or edit tests. We also added options to configure test discovery.

Improvements to unit testing for C++ include Boost.Test item templates and added support for the Boost dynamic library. Also, Visual Studio automatically discovers your tests in CMake projects (CTest, Google Test, and Boost.Test). After you build, you’ll receive an even more granular view of your tests.

C++

As part of our ongoing effort towards C++17 standards conformance, the C++ workload now includes support for stable_sort, partition, and inline vector::emplace_back in parallel, as well as <memory_resource> and guaranteed copy elision, so you don’t have to write an artificial copy or move constructor for types where copy elision will happen.

To make it easier to catch errors beyond your active configuration, IntelliSense errors for inactive configurations now appear as purple squiggles in the editor. You can set the number of configurations you want to process in Tools > Options > Text Editor > C/C++ > Advanced.

We also made C++ improvements for arithmetic overflow checks in C++ Core Check, single file code analysis, and throughput performance and advice. You can discover more features for increasing your productivity on our Visual C++ Team Blog.

.NET Mobile Development

This release adds a feature to configure your macOS build environment automatically, to make building iOS apps with Visual Studio on Windows easier. Visual Studio will handle the heavy lifting of setup, removing the need to install and update your Mac build machine manually.

We also added the ability for both Windows and Mac users to deploy iOS apps over the network with Wi-Fi debugging. To get started, simply pair a wireless device with Xcode, and use it as your deployment target.

Build Tools

The 15.6 Build Tools let you set up build servers without a full Visual Studio installation. Build Tools now supports TypeScript and Node.js project types in addition to support for C++, ASP.NET, and .NET Core for Desktop projects. Other improvements to the MSBuild component of the Build Tools include the ability to easily and seamlessly leverage NuGet to resolve SDKs without extra package modification. We’ve created an SDK repository for the community to use. More information is available here, and please provide MSBuild feedback here.

Visual Studio for Mac version 7.4

Visual Studio for Mac version 7.4 is also available today. It includes improvements in performance and stability, as well as fixes for many of the top reported issues. This release includes support for macOS High Sierra and C# 7.1, and core architectural changes for C# editing (powered by Roslyn), resulting in improved IntelliSense performance and typing responsiveness.

You can read the complete release notes and access Visual Studio for Mac downloads on VisualStudio.com.

Share Your Feedback

As always, we want to know what you think. Please install Visual Studio 2017 Version 15.6 and Visual Studio for Mac and share your thoughts and concerns.

Please let us know any issues you have via the Report a Problem tool in Visual Studio. You can track your issues in the Visual Studio Developer Community where you can ask questions and find answers. You can also engage with us and other Visual Studio developers through our new Gitter community (requires GitHub account), make a product suggestion through UserVoice, or get free installation help through our Live Chat support.

John Montgomery, Director of Program Management for Visual Studio
@JohnMont

John is responsible for product design and customer success for all of Visual Studio, C++, C#, VB, JavaScript, and .NET. John has been at Microsoft for 17 years, working in developer technologies the whole time.

F# language and tools update for Visual Studio 2017 version 15.6


With the release of Visual Studio 2017 version 15.6, we’re excited to share updates to the F# language and core library, F# tooling in Visual Studio, and infrastructure updates that concern OSS contributors. Let’s dive in!

F# language and core library updates

Some foundational changes for the F# language and core library have been made, in addition to smaller behavioral language changes and added features to FSharp.Core.

Versioning changes

We have drafted and implemented a versioning change for F# moving forward. There are two major changes involved:

  1. Decoupling tooling artifact versioning from F# language versioning, allowing all tools to rev semantically and align with the multiple release trains for the different products we must deliver on, rather than artificially aligning with an F# language version.
  2. Aligning the F# language, FSharp.Core binary, and FSharp.Core package to the same versioning scheme, thus creating a consistent versioning model for these assets.

To do this, we will rev the F# language version from 4.1 to 4.5 when Visual Studio 2017 version 15.7 releases, as documented in the linked RFC. We will have a separate blog post which reiterates this change when it is released.

F# Language updates

Numerous bug fixes and smaller improvements have been made in the F# compiler, which included contributions from the F# OSS community.

The primary behavioral change for the F# language in this update is a backwards-compatible change to make F# tuple and System.Tuple types 100% synonymous.

Now that F# tuples and System.Tuples are synonymous, we emit a warning when accessing the .Item and .Rest properties on a System.Tuple.

This fixes inconsistencies with how these two types interact and fixes a regression introduced in VS 2017 version 15.4.

Additionally, some recent regressions in inference order for Statically Resolved Type Parameters (SRTP) have been fixed, and the behavior has been stabilized. Moving forward, the current behavior for SRTP is cemented. This advanced and less-used area of the language is quite touchy as far as the compiler is concerned, and we feel stability is the most important thing moving forward.

F# Core Library updates

The F# Core Library, known as FSharp.Core, is distributed via its NuGet package. We include the binary in Visual Studio and a reference to the package in our .NET Core SDK support. The following additions are now available:

  • Support for IReadonlyList and IReadonlyDictionary in relevant F# collections, by Saul Rennison.
  • Support for Async.StartImmediateAsTask, added by Onur Gumus.
  • Support for NativePtr.toByref, added by mjmckp.
  • Support for Seq.transpose/List.transpose/Array.transpose, by Patrick McDonald.
  • The IsSerializable property is now present on F# types for the .NET Standard version of the library.

This smaller set of improvements is an example of the sorts of things we’re looking to have in future FSharp.Core releases.

F# tooling improvements in Visual Studio

The largest focus for us recently has been in improving the development experience for F# with .NET Core SDK projects. This started with initial support in the 15.5 release, where we also made F# installed by default in Visual Studio 2017 wherever .NET Core is required. We have more updates to share now.

New functionality

With the 15.6 release, we’ve focused on bringing project support for .NET Core SDK-based projects up to full file ordering and folder support.

If you select Add Item on the project node, the file will be added to the top of the list rather than the bottom. If you do this on a folder node, the file will be added to the top of the list underneath the folder. This fixes a long-standing issue for newcomers where newly-added files that weren’t moved up the list could lead to a compiler error. Add Above/Add Below work as they do in legacy F# projects.

We have one final bit of functionality to implement: initial ordering for a file which has been pasted in. It will currently add the file to the bottom of the list. We are targeting the next release of Visual Studio for this.

Additionally, we have added support for multitargeting.

You can toggle the multitargeting context and get IDE features for a particular target with the Navigation Bars dropdown.

With proper file ordering, folders, and multitargeting support, we’re declaring .NET Core SDK project types as fully supported. In a future release, we’ll also have ASP.NET Core templates with optional Docker support added as a project type you can create with File | New Project.

To see the full list of F# changes going into VS 15.6, see the release notes.

A view into the feature-rich IDE experience for F#

Throughout the year, many IDE features have made it into Visual Studio through open source contributions. Here are some of our favorites:

Go to Definition by clicking on symbols in QuickInfo, by Jakob Majocha.

Analyzers and code fixes for mistyped names and unused values, by Steffen Forkmann and Vasily Kirichenko.

Find All References and Inline Rename across projects, by Vasily Kirichenko.

Completion for items in unopened namespaces, by Vasily Kirichenko.

There are too many amazing features to talk about in a single blog post. You’ll have to explore them yourself! It’s exciting to see the advancements made in the past year, and we look forward to seeing what the next year will bring in Visual Studio. To get started, check out the official docs.

Open source infrastructure

Finally, we’ve made sweeping improvements to our infrastructure and paid off years of technical debt. We’re not done yet, but we’re quite excited about a few things that impact OSS contributors:

  • Nightly builds can now be produced in about an hour, so you can see new features very quickly.
  • We have a new nightly feed for preview versions of F# tools for Visual Studio.
  • All localization files have been made open source (moved from an internal repository), and improvements to translated files can be made directly on GitHub now.
  • We removed our dependency on the Windows 10 SDK and are close to removing all machine state dependencies from our build, with the goal of allowing you to build any part of the repository after running the build script after a clone.
  • We now use a bot to merge changes across concurrent shipping branches so that we can more easily detect any dependency issues for future releases.
  • We now build the F# Compiler Service project as a part of our CI, thanks to Steffen Forkmann.

We’re continuing to invest in our infrastructure to keep up with the advancing world of multiple IDEs and SDKs which ship releases on different schedules. We’re also excited to further simplify OSS development by removing all machine state dependencies.

Wrapping up

We’re very excited about these F# updates moving forward, and we look forward to sharing more updates about F#, the entire F# tooling experience, and the greater F# ecosystem soon. Happy F# coding!

Major build speed improvements – Try .NET Core 2.1 Preview 1 today


Head over to the main .NET Core download page and pick up .NET Core 2.1 - Preview 1.

The SDK contains the tools you need to build and run apps with .NET Core and supports Mac, Windows, Ubuntu, Debian, Fedora, CentOS/Oracle, openSUSE, and we even have Docker images for Stretch, Alpine, and more. It's not your grandmother's Microsoft. ;)

Once you've installed it, from a prompt type "dotnet" and note a few new built-in switches:

C:\Users\scott> dotnet
Usage: dotnet [options]
Usage: dotnet [path-to-application]
Options:
  -h|--help         Display help.
  --version         Display the current SDK version.
  --list-sdks       Display the installed SDKs.
  --list-runtimes   Display the installed runtimes.
path-to-application:
  The path to an application .dll file to execute.

I'll run it again twice with --list-sdks and --list-runtimes:

C:\Users\scott> dotnet --list-sdks
2.1.300-preview1-008174 [C:\Program Files\dotnet\sdk]
2.1.4 [C:\Program Files\dotnet\sdk]

C:\Users\scott> dotnet --list-runtimes
Microsoft.AspNetCore.All 2.1.0-preview1-final [C:\Program Files\dotnet\shared]
Microsoft.AspNetCore.App 2.1.0-preview1-final [C:\Program Files\dotnet\shared]
Microsoft.NETCore.App 2.0.5 [C:\Program Files\dotnet\shared]
Microsoft.NETCore.App 2.1.0-preview1-26216-03 [C:\Program Files\dotnet\shared]

There are a few interesting things happening here. You can see that before I had the runtime for .NET Core 2.0.5, and now I also have the 2.1.0 preview.

It can also be a little confusing that the SDK and Runtime sometimes have different versions, similar to JREs and JDKs. Simply stated - the thing that builds sometimes updates while the thing that runs doesn't. So the .NET Core SDK and compilers might get fixes while the runtime doesn't. I'm told they're going to line things up better. You can read deeply on versioning if you like.

You'll also notice AspNetCore.App, which is a metapackage (a package of packages) that contains less than All and helps you make smaller apps.
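In a project file, the metapackage is a single reference; here's a sketch (the version shown is the preview at the time of writing and may change):

<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- Pulls in a smaller set of packages than Microsoft.AspNetCore.All -->
    <PackageReference Include="Microsoft.AspNetCore.App" Version="2.1.0-preview1-final" />
  </ItemGroup>
</Project>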

If you install a beta or preview you might be worried it'll mess stuff up. It won't.

You can type "dotnet new globaljson" and make a file that looks like this! Then "pin" the SDK version you want to use:

{
  "sdk": {
    "version": "2.1.300-preview1-008174"
  }
}

I'll change this to .NET Core's older SDK and try building the .NET Core-based Gameboy Emulator from my last post. It's amazing.

Let's see how fast it builds today on .NET Core 2.0:

C:\github\Retro.Net> Measure-Command { dotnet build }
Milliseconds      : 586
Ticks             : 65864065
TotalSeconds      : 6.5864065
TotalMilliseconds : 6586.4065

Ok, about 6.5 seconds on my Surface.

Let's make the SDK version the new .NET Core 2.1 Preview 1 - it has a bunch of build speed improvements.

All I have to do is change the global.json file. Update the sdk version in the global.json and type "dotnet --version" to see that it took.

I can have as many .NET Core SDKs as I like on my machine and I can control what SDK versions are being used on a tree by tree basis. That means you CAN download .NET Core 2.1 and not mess things up if you're paying attention.

C:\github\Retro.Net> Measure-Command { dotnet build }
Milliseconds      : 727
Ticks             : 27270864
TotalSeconds      : 2.7270864
TotalMilliseconds : 2727.0864

Hey it's less than 3 seconds. 2.7 in fact! More than twice as fast.

Build times as much as 10x faster

The bigger the app, the faster incremental builds should be. In some cases we will see (by release) 10x improvements.

It's quick to install (and quick to uninstall) and you can control the SDK version (list them with "dotnet --list-sdks") with the global.json.

Please go download the preview and let me know either on Twitter or in the comments what your before and after build times are!






DevOps for IoT with Win10 IoT Core, UWP, and VSTS

We often get asked how to do CI/CD for IoT apps using Win10 IoT Core. If you’ve been considering or using Win10 IoT Core, then read on. The Visual Studio Test Platform that ships with Visual Studio 15.6 RTW now supports Testing on Win10 IoT Core. Continuous Integration (CI) and Continuous Delivery (CD) are key... Read More

Just-in-Time VM Access is generally available


Azure Security Center provides several threat prevention mechanisms to help you reduce surface areas susceptible to attack. One of those mechanisms is Just-in-Time (JIT) VM Access. Today we are excited to announce the general availability of Just-in-Time VM Access, which reduces your exposure to network volumetric attacks by enabling you to deny persistent access while providing controlled access to VMs when needed.

When you enable JIT for your VMs, you can create a policy that determines the ports to be protected, how long ports remain open, and approved IP addresses from where these ports can be accessed. The policy helps you stay in control of what users can do when they request access. Requests are logged in the Azure Activity Log, so you can easily monitor and audit access. The policy will also help you quickly identify existing virtual machines that have JIT enabled and virtual machines where JIT is recommended.
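Conceptually, a JIT policy boils down to a small document per VM. The sketch below is illustrative only; the field names approximate the Security Center REST API and are not authoritative:

{
  "properties": {
    "virtualMachines": [
      {
        "id": "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Compute/virtualMachines/<vm-name>",
        "ports": [
          {
            "number": 3389,
            "protocol": "TCP",
            "allowedSourceAddressPrefix": "<approved-ip-or-cidr>",
            "maxRequestAccessDuration": "PT3H"
          }
        ]
      }
    ]
  }
}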

This feature is available in the standard pricing tier of Security Center, and you can try Security Center for free for the first 60 days.

To learn more about these features in Security Center, visit our public preview blog and documentation.

Migrate your databases to a fully managed service with Azure SQL Database Managed Instance


This blog post was co-authored by Eric Hudson, Senior Product Marketing Manager, CADD & AI.

We’re excited to announce the preview of Azure SQL Database Managed Instance, a new deployment option in SQL Database that streamlines the migration of SQL Server workloads to a fully managed database service. This new Managed Instance deployment option provides full SQL Server engine compatibility and native virtual network (VNET) support. 


“SQL Managed Instance is that happy medium we were looking for. We needed the power and compatibility of SQL Server, but without the management overhead and cost that comes with running VMs 24x7. Not only will we get that power and ease of management, we’ll also be able to use the Azure Hybrid Benefit, which allows us to use our existing SQL Server licensing through Software Assurance. Developing, deploying, and managing our application is getting a whole lot easier and cheaper with Azure and SQL Managed Instance.”

Robert Shurbet, Senior Software Development Professional, Pivot Technology Solutions

Migrate your databases to a fully-managed service

Azure SQL Database is a fully-managed database service, which means that Microsoft operates SQL Server for you and ensures its availability and performance. SQL Database also includes innovative features that can enhance your security and business continuity to levels you’ve never experienced before. For example, you can rely on built-in high availability (HA), automated backups, and point-in-time restore to ensure business continuity. With a comprehensive security portfolio, you can implement intelligent features like Threat Detection, which acts like an alarm system over your database, providing proactive alerts about potential malicious activities—imagine virtually hands-free administration for hundreds to thousands of your SQL Server databases!

In addition to the benefits of a fully-managed service, Managed Instance enables an instance-scoped programming model that provides high compatibility with on-premises SQL Server (2005 through current versions), reducing or eliminating the need to re-architect applications and manage those databases after they are in the cloud. With Managed Instance, you can continue to rely on the same SQL Server tools you’ve used and loved for years—in the cloud, too. These include features such as native database restore, SQL Agent, Database Mail, Service Broker, Common Language Runtime (CLR), and Change Data Capture. Coming soon, you will be able to migrate your SQL Server Integration Services (SSIS) packages to Azure in a managed environment within Azure Data Factory to increase productivity and lower total cost of ownership.

To streamline your migration to the cloud, Microsoft is also announcing the preview for Azure Database Migration Services (Azure DMS) support for Managed Instance. This enables SQL Server customers to more easily migrate their workloads to SQL Database Managed Instance. Azure DMS reduces the complexity of your cloud migration through a single, comprehensive service. The service supports migrations of homogeneous/heterogeneous source-target pairs, and the guided migration process is easy to understand and implement.  Customers like DocuSign overwhelmingly see the value SQL Database Managed Instance brings to their productivity and scale initiatives.

“We process 1.1 million transactions per day on our platform and our transaction volume approximately doubles every year. We wanted the best of what we do in our data center, with the best of what Azure could bring to it. For us, we found Azure SQL Database was the best way to do it. We basically picked up our existing infrastructure and were able to deploy to Azure within a few seconds. It allows us to scale the business very quickly, with very minimal effort. The holy grail in a move like this is being able to lift your existing platform, move it into Azure without significant amounts of work, without having to re-engineer things.”

Eric Fleischman, Vice President & Chief Architect, DocuSign

To learn more, visit the Database Migration Service page.

Ensure ultimate isolation and security

Managed Instance is fully contained in your virtual network, so it provides the ultimate level of isolation and security for your data. You can now get the benefits of the public cloud while keeping your environment isolated from the public Internet.

The following diagram shows several options to deploy various application topologies completely in Azure or in a hybrid environment, regardless of whether you choose a fully managed service or hosted model for your front-end applications.


Any of these options allows connectivity to a SQL endpoint (TDS, the tabular data stream protocol) only through private IP addresses, which guarantees the optimal level of isolation for your data. For more details, see How to connect your application to Managed Instance.
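From application code, connecting looks like any other SQL Server connection, just pointed at the instance's VNET-local endpoint; a minimal sketch (the host name and credentials below are placeholders):

using System;
using System.Data.SqlClient;

class ManagedInstanceConnectExample
{
    static void Main()
    {
        // Placeholder host; a Managed Instance endpoint resolves to a private IP inside your VNET.
        var builder = new SqlConnectionStringBuilder
        {
            DataSource = "myinstance.abc123def456.database.windows.net",
            InitialCatalog = "MyDatabase",
            UserID = "myadmin",
            Password = "<password>"
        };

        using (var connection = new SqlConnection(builder.ConnectionString))
        {
            connection.Open();
            Console.WriteLine($"Connected to {connection.DataSource}");
        }
    }
}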

Right-size your on-premises workloads in the cloud

Managed Instance introduces a pricing model based on virtual cores (vCores) that provides flexibility in selecting the right level of compute and storage resources, making it easy to size workloads by comparison with on-premises physical cores. Managed Instance is initially available in a single performance tier, General Purpose, with the Business Critical tier coming online during the course of the preview. For more information, see the SQL Database pricing page.

Save up to 30%* by leveraging on-premises SQL Server licenses

We’ve made Managed Instance a great economic choice, too, by offering support for Azure Hybrid Benefit for SQL Server. This benefit helps you maximize the value of your current SQL Server licensing investments and accelerate your migration to the cloud. Azure Hybrid Benefit for SQL Server is a benefit exclusive to Azure that enables you to use your SQL Server licenses with Software Assurance to pay a reduced rate on Managed Instance. Additionally, you will be able to migrate your SQL Server Integration Services (SSIS) licenses to Azure Data Factory for a reduced rate with Azure Hybrid Benefit. If you are a SQL Server Enterprise Edition or Standard Edition customer and you have Software Assurance (SA), the Azure Hybrid Benefit for SQL Server can help you save up to 30%* on Managed Instance.

Available today in 20 Azure regions

At the time of this announcement, Managed Instance is initially available in 20 Azure regions: Canada Central, Canada East, Central US, East Asia, East US, East US 2, Japan East, Korea Central, Korea South, North Central US, North Europe, South Central US, South India, Southeast Asia, UK South, West Central US, West Europe, West India, West US, and West US 2. Managed Instance will be made available in more Azure regions through the course of preview.

Get started today!

Try Managed Instance today and let us know what you think! If you’re ready to migrate to a fully-managed database service, enroll in the preview from the Azure portal and accept the Preview terms.

Given high demand, we are managing acceptance into preview to ensure the best experience for our customers. Once you enroll in the portal, our goal is to confirm your status within three business days. Once accepted, you’ll be able to return to the Azure portal to complete the provisioning process. Otherwise, you’ll be invited to re-enroll at a later date.

Once you are in the preview, check out the Create SQL Database Managed Instance quickstart to create your first instance and learn how to easily migrate to SQL Database Managed Instance by using Azure DMS. You can also watch a five-minute video from Lara Rubbelke to learn more about Managed Instance and how it works.

For more details on user scenarios and a list of features and capabilities, visit the Managed Instance documentation page.

We’re excited for you to try SQL Database Managed Instance and experience virtually hands-free administration on all your SQL Server databases!

*Savings based on Azure SQL Database Managed Instance pricing in the US East region. Savings are calculated for the Base rate when compared to the fully priced (license-included) rate. The Software Assurance (SA) cost included is for Open, No Level (NL); SA prices may vary based on your EA agreement.

Public preview of Java on App Service, built-in support for Tomcat and OpenJDK

A couple of months back, we announced the general availability of App Service on Linux, starting with support for .NET Core, Node.js, Ruby, PHP, and custom Docker containers. Today, we are glad to share the public preview of Java apps on App Service. This release includes built-in support for Apache Tomcat 8.5/9.0 and OpenJDK 8, making it easy for Java developers to deploy web or API apps to Azure. Just bring your .jar or .war file to Azure App Service and we’ll take care of the capacity provisioning, server maintenance, and load balancing. 

Create and deploy a Java web app easily

Creating a Java web app is easy with App Service using our out-of-box support for Tomcat and OpenJDK. You can deploy your .jar or .war file to Azure and get it up and running at scale with just a few clicks. If you have other preferred images such as Jetty or a different JRE, you can also build your own Docker image and deploy it to App Service. 
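
For example, a self-contained .jar needs nothing more than a main class that starts an HTTP listener. The sketch below uses the JDK’s built-in com.sun.net.httpserver and assumes, for illustration, that the platform passes the listening port in a PORT environment variable, falling back to 8080 for local runs.

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class App {
    public static void main(String[] args) throws Exception {
        // PORT is an assumed platform convention; 8080 is the local fallback.
        int port = Integer.parseInt(System.getenv().getOrDefault("PORT", "8080"));
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/", exchange -> {
            byte[] body = "Hello from Java on App Service".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start(); // serves requests until the process is stopped
    }
}

Package this with your build tool of choice into a runnable .jar, and the platform handles the rest.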

Here’s an example of creating a Java web app with a Tomcat image in the portal:

[Screenshot: creating a Java web app with a Tomcat image in the Azure portal]

App Service integrates well with your favorite code repos, IDEs, and CLI tools. You can deploy from all the places you would expect, including Git, GitHub, Bitbucket, and FTP, among others. You can also build and publish apps to App Service from your favorite Java IDE using the Azure Toolkit for Eclipse or IntelliJ on Windows, macOS, and Linux. Deployments can also be managed from the command line on Linux, macOS, or Windows with the Azure Command Line Interface (CLI), which runs as a local client or in a browser via the Azure portal or shell.azure.com. The CLI is even available in the Azure management app on your iPhone or Android device.

Extensive support of SDKs and services for Java on Azure

Once your Java app is deployed, you can extend it by adding code from the Azure SDK for Java to work with various Azure services such as Storage, Azure Database for MySQL, and Azure Database for PostgreSQL. You can also get more granular insights into your app by adding agent calls through the Application Insights Java SDK. Looking for managed identity services? Use the Azure Active Directory Library for Java to plug into identity services that run in 80 percent of the world’s enterprises.
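
As one hedged example of wiring a deployed app to an Azure data service, the sketch below reaches Azure Database for MySQL with plain JDBC and MySQL Connector/J rather than a dedicated Azure SDK. The server and database names are hypothetical; the user@server login format follows the service’s documentation.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MySqlPing {
    public static void main(String[] args) throws Exception {
        // Placeholder server and database; SSL is enabled on the connection.
        String url = "jdbc:mysql://myserver.mysql.database.azure.com:3306/appdb"
                   + "?useSSL=true&serverTimezone=UTC";
        // Password is passed as the first program argument in this sketch.
        try (Connection conn = DriverManager.getConnection(url, "appuser@myserver", args[0]);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT VERSION()")) {
            if (rs.next()) {
                System.out.println("Connected to MySQL " + rs.getString(1));
            }
        }
    }
}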

These are just a handful of examples of the SDKs and services available from Azure to boost your productivity and enrich your Java-based web apps. To help you get started, we have dozens of code samples that use the SDK. All of our SDKs run on Windows, macOS, and Linux, and are open sourced and available on GitHub.

Get started for free

Get started easily with our Java Developer Center. You can also review tutorials and guidance on the Java documentation hub.


Making Azure the best place for all your applications and data

The confluence of cloud, data, and AI is driving unprecedented change. The ability to utilize data and turn it into breakthrough actions and experiences is foundational to innovation today. Organizations are using cloud technologies to accelerate the innovation that drives their business; migrating applications and data to the cloud is a key initiative within this strategy.
 
Only Microsoft delivers a consistent, global platform optimized for hybrid cloud across all applications and data. Microsoft provides the flexibility to maintain consistent hybrid cloud environments, along with a rich choice of migration and modernization options. This consistency lets you distribute your applications and data workloads on your terms, without the complexity of maintaining different skill sets, systems, and tools, while still extracting the insights from data that drive innovation.

Today, we are excited to announce investments that dramatically expand the choice and ROI of moving your SQL Server and open source applications to Azure. SQL Server customers can now try the preview of SQL Database Managed Instance, the Azure Hybrid Benefit for SQL Server, which can help customers save up to 30%*, and the Azure Database Migration Service preview for Managed Instance. Additionally, we are excited to announce the preview of Apache Tomcat® support in Azure App Service and the general availability of Azure Database for MySQL and PostgreSQL in the coming weeks, making it even easier to bring your open source-powered applications to Azure.

Migrate SQL Server Workloads

With Managed Instance, the new deployment option in Azure SQL Database, we are making it seamless to move any SQL Server application to Azure without application changes. Managed Instance offers full engine compatibility with existing SQL Server deployments, including capabilities like SQL Agent, Database Mail, and Change Data Capture, to name a few, and is built on the highly productive SQL Database service. Customers who participated in the limited preview are excited by the differentiated value Azure SQL Database delivers versus other options available in the market, including a fully managed service, built-in HA, and semi-autonomous database functionality such as performance monitoring and tuning, and Threat Detection, which serves as an alarm system for your databases.

With the launch of SQL Database Managed Instance, we are also expanding the Azure Hybrid Benefit program to include support for SQL Server. You can now move your on-premises SQL Server licenses with active Software Assurance to Managed Instance and soon your SQL Server Integration Services licenses to Azure Data Factory for discounted pricing, helping you save up to 30%*.

Managed Instance preview customers like Pivot Technology Solutions are loving the combination of the fully managed database service with licensing discounts. “SQL Managed Instance is that happy medium we were looking for. We needed the power and compatibility of SQL Server, but without the management overhead and cost that comes with running VMs 24x7,” states Robert Shurbet, Senior Software Development Professional, Pivot Technology Solutions. “Not only will we get that power and ease of management, we’ll also be able to use the Azure Hybrid Benefit, which allows us to use our existing SQL Server licensing through Software Assurance. Developing, deploying, and managing our application is getting a whole lot easier and cheaper with Azure and SQL Managed Instance.”

And finally, to help you streamline the migration of your databases, we are also excited to expand the Azure Database Migration Service to support SQL Database Managed Instance. The Azure Database Migration Service is designed as a seamless, end-to-end solution for moving your on-premises SQL Server to Azure SQL Database, and it marks the beginning of a full roadmap of database support. The complete service roadmap will support database migrations across a variety of source and destination pairs, including upcoming support for migrating MySQL and PostgreSQL databases to Azure Database for MySQL and Azure Database for PostgreSQL.

Migrate open source workloads

Making Azure the best destination for all your applications and data requires full support for your open source-powered workloads. Today, we are excited to introduce a preview of built-in support for Apache Tomcat and OpenJDK 8 in Azure App Service, making it easy for Java developers to deploy web applications and APIs to Azure’s market-leading PaaS. Just bring your .jar or .war file to App Service and we’ll take care of the capacity provisioning, server maintenance, and load balancing. If you have other preferred images, such as Jetty or a different version of the JRE, you can also build your own Docker image and deploy it to App Service by pointing to your container registry. Once your Java application is deployed, you can then extend it with the Azure SDK for Java to work with various Azure services such as Storage, Azure Database for MySQL, and Azure Database for PostgreSQL.

We are also excited to release Azure database services for MySQL and PostgreSQL into general availability in the coming weeks. These services have enjoyed tremendous growth since launching into preview last year, proving that developers want database choice delivered on a simple, fully managed service platform that removes the complexities around data availability, protection, and scale. Azure Database for MySQL and Azure Database for PostgreSQL offer the community editions to ensure the simplest migration and fastest path to development on Azure. Open source-supported services on Azure make it extremely easy for companies like GeekWire to move their workloads to the cloud without disruption to their technology stack and developer culture.

According to Kevin Lisota, GeekWire developer, “the [GeekWire] website is our lifeblood. Reliability of the site is critical. It’s not unusual for the site to experience a 5x, 10x increase in traffic. We migrated the site to Azure and we host the site on a variety of different Azure services to support those really unpredictable traffic spikes. GeekWire runs on WordPress, running Nginx and PHP…the key decision-maker [to migrate to Azure] was when Microsoft launched Azure Database for MySQL so we could also run the database layer together with the whole stack.”

Azure is the best place for your applications and data

Today’s announcements are yet another step towards making Azure the best place for all your apps, data, and infrastructure needs. On February 28, we also announced the general availability of Azure Migrate. Azure Migrate helps you confidently plan cloud migration projects with rich insight, including prescriptive guidance on workload readiness, right-sized resources, and cost estimates. Our vision is to make Azure Migrate the single place for you to discover and assess all your on-premises workloads.  

These investments together deliver deeper platform consistency across on-premises and cloud, a rich open source application framework and database support, and expanded cost-savings for Microsoft customers. We’re committed to making your migration to Azure as smooth as possible, while paving the way for accelerated innovation. Our goal is to build technology that empowers today’s innovators to unleash the power of their applications and data and reimagine possibilities that will improve our world.

Get started today!

We can’t wait for you to try these new technologies! Get started today and let us know your feedback.

*Savings based on Azure SQL Database Managed Instance pricing in the US East region. Savings are calculated for the Base rate when compared to the fully priced (license-included) rate. The Software Assurance (SA) cost included is for Open, No Level (NL); SA prices may vary based on your EA agreement.

R data concepts, for Excel users

Excel users starting to use R likely have some established concepts about data: where it’s stored, how functions apply to it, and so on. In general, R does things differently from Excel (or any spreadsheet, in fact). In a useful guide, Steph de Silva from Rex Analytics explains the concepts of data management in R and how they differ from Excel, which provides a useful mental model for those making the transition or working in both.

[Diagram: where data, code, and results live in Excel vs. R]
The complete guide is linked below, but in summary the differences are:

  • Excel stores data in the grid structure of the worksheet, whereas R stores data in individual objects, accessed and manipulated by the R language.
  • In Excel, code is formulas associated with cells of the worksheet. In R, code is functions provided by the R language or that you write yourself.
  • In Excel, you store the results of calculations in the worksheet, but in R results (like data) are stored in objects.

With these differences in mind, Excel users will have a much easier time adapting to R. Find the complete details at the link below.

Rex Analytics: Where do things live in R? R for Excel Users (via Mara Averick)

Join the Bing Maps Team at NAFA 2018 Institute & Expo

Known as the largest event for fleet professionals, NAFA 2018 Institute & Expo is the perfect place to see the latest in fleet management solutions, and we’ll be there!

Join us in Anaheim, CA on April 24-27. With the release of the Bing Maps Fleet Management API collection and our open source Fleet Tracker solution, we have a lot of exciting new developments to share.

Bing Maps Fleet Management APIs

The December release of three new APIs (Truck Routing, Isochrone, and Snap-to-Road) rounded out our Fleet Management collection, and we plan to showcase what each can do at NAFA 2018 I&E. We’ll also look at how the Distance Matrix API can be used to solve the traveling salesman problem and other complex multi-source, multi-destination routing problems.
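
To sketch the idea: once a distance matrix is in hand (whether from the Distance Matrix API or elsewhere), a small traveling salesman instance can be solved exactly by enumerating tours. The 4x4 matrix below is made-up sample data, not API output.

public class TinyTsp {
    // Made-up pairwise distances between four stops; DIST[i][j] is i -> j.
    static final double[][] DIST = {
        {0, 10, 15, 20},
        {10, 0, 35, 25},
        {15, 35, 0, 30},
        {20, 25, 30, 0}
    };
    static double best = Double.MAX_VALUE;

    public static void main(String[] args) {
        boolean[] visited = new boolean[DIST.length];
        visited[0] = true;               // fix stop 0 as the depot
        search(0, 1, 0.0, visited);
        System.out.println("Shortest round trip: " + best);
    }

    // Depth-first enumeration of all tours that start and end at stop 0.
    static void search(int city, int count, double cost, boolean[] visited) {
        if (cost >= best) return;        // simple branch-and-bound pruning
        if (count == DIST.length) {
            best = Math.min(best, cost + DIST[city][0]); // close the loop
            return;
        }
        for (int next = 0; next < DIST.length; next++) {
            if (!visited[next]) {
                visited[next] = true;
                search(next, count + 1, cost + DIST[city][next], visited);
                visited[next] = false;
            }
        }
    }
}

Brute force is fine for a handful of stops; real fleets need the heuristic and optimization techniques the API demos will cover.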

Bing Maps Fleet Tracker Solution

Announced in January, the Bing Maps Fleet Tracker solution allows you to deploy your own vehicle tracking solution to locate drivers and assets, and includes trip detection, geofencing, and email notifications. You’ll see firsthand how easy it is to deploy a complete Fleet Tracker application to Azure in 10 minutes, without writing any code, using our deployment portal.

We look forward to demoing all the exciting scenarios possible. Come check us out at Booth #100. If you are not able to attend NAFA 2018 I&E, no problem; we will share updates on the blog after the event. Stay tuned.

In the meantime, you can learn more and explore our Fleet Management APIs at https://www.microsoft.com/en-us/maps/fleet-management.

We hope to see you in Anaheim!

- Bing Maps Team

VSTS Update – Mar 5

This week we began rolling out the sprint 131 updates.  You can get the details in the release notes.

There are some good things in this sprint payload (a good start on a new Work Items hub, Azure DevOps Projects support for VMs, improved GitHub integration, release badges, and more). At the same time, this isn’t the most value-packed sprint. The reason is that we have some really big new things that will start coming to light in the next few sprints, and much of the team has been busy working on those. Stay tuned for great things that are coming.

Brian

Azure SQL Database now offers zone redundant Premium databases and elastic pools

The Azure SQL Database Premium tier supports multiple redundant replicas for each database, automatically provisioned in the same datacenter within a region. This design leverages SQL Server AlwaysOn technology and provides resilience to server failures with a 99.99% availability SLA and RPO=0. With the introduction of Azure Availability Zones, we are happy to announce that SQL Database now offers built-in support for Availability Zones in its Premium service tier. By placing individual database replicas in different availability zones, Premium databases become resilient to a much larger set of failures, including catastrophic datacenter outages. The built-in support for Availability Zones further enhances the High Availability (HA) solutions in Azure SQL Database. For more information, see High-availability and Azure SQL Database.

To take advantage of this capability, you simply select the zone redundant option during database or elastic pool creation; you can also enable it for existing databases or pools. If availability zones are supported in the region where your database or pool is deployed, Azure SQL Database will reconfigure it automatically, without any downtime.

You can use the Azure portal to enable the zone redundant database configuration, as illustrated in the following image.

[Screenshot: the zone redundant option in the Azure portal]

You can also configure this setting by using the Azure CLI, PowerShell, or the REST API, as the sketch below illustrates.
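
Here is a hedged sketch of the REST route in Java 11+: it PATCHes the database resource to flip its zone redundancy flag. The zoneRedundant property and api-version reflect our reading of the Azure SQL REST documentation and should be verified against the current API reference; the subscription, resource group, server, database, and bearer token are all placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class EnableZoneRedundant {
    public static void main(String[] args) throws Exception {
        // Placeholder identifiers; substitute real values before running.
        String sub = "00000000-0000-0000-0000-000000000000";
        String rg = "my-resource-group";
        String server = "my-sql-server";
        String db = "my-database";
        String token = System.getenv("AZURE_TOKEN"); // Azure AD bearer token

        String endpoint = String.format(
            "https://management.azure.com/subscriptions/%s/resourceGroups/%s"
          + "/providers/Microsoft.Sql/servers/%s/databases/%s"
          + "?api-version=2017-10-01-preview", sub, rg, server, db);

        // PATCH the database resource, setting only the zoneRedundant flag.
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(endpoint))
            .header("Authorization", "Bearer " + token)
            .header("Content-Type", "application/json")
            .method("PATCH", HttpRequest.BodyPublishers.ofString(
                "{\"properties\": {\"zoneRedundant\": true}}"))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}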

The preview of zone redundant Premium databases and pools is currently available in Central US, West Europe, and France Central, with additional regions added over time.
