
Accelerated and Flexible Restore Points with SQL Data Warehouse


We are thrilled to announce that SQL Data Warehouse (SQL DW) has released accelerated and flexible restore points for fast data recovery. SQL DW is a fully managed and secure analytics platform for the enterprise, optimized for running complex queries fast across petabytes of data.

The ability to quickly restore a data warehouse protects customers against accidental corruption and deletion, and supports disaster recovery. We have also seen scenarios where compliance requirements, or the need for multiple test and development copies of a data warehouse, demand stronger capabilities in this area. To continue delivering first-class data protection and recovery, we have released the following improvements, seamlessly integrated within the Azure portal.

Finer granularity for cross region and server restores

You can now restore across regions and servers using any restore point, rather than only the geo-redundant backups taken every 24 hours. Cross-region and cross-server restore is supported for both user-defined and automatic restore points, enabling finer granularity for additional data protection. With more restore points available, you can be confident that your data warehouse will be logically consistent when restoring across regions.
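To see why finer granularity matters, consider a small Python helper that picks the latest restore point at or before a target recovery time; the snapshot cadences below are illustrative, not the service's actual schedule:

```python
from datetime import datetime, timedelta

def nearest_restore_point(restore_points, target):
    """Return the latest restore point taken at or before the target time.

    With only daily geo-redundant backups, the recovery point can lag the
    target by up to 24 hours; with frequent automatic and user-defined
    snapshots, the gap shrinks accordingly.
    """
    candidates = [rp for rp in restore_points if rp <= target]
    if not candidates:
        raise ValueError("no restore point precedes the target time")
    return max(candidates)

# Illustrative cadences: one geo backup per day vs. a snapshot every 8 hours.
day = datetime(2018, 7, 23)
geo_backups = [day]
snapshots = [day + timedelta(hours=8 * i) for i in range(4)]

target = day + timedelta(hours=20)
print(nearest_restore_point(geo_backups, target))  # lags the target by 20 hours
print(nearest_restore_point(snapshots, target))    # lags the target by only 4 hours
```

With denser restore points, the worst-case gap between the chosen recovery point and the desired one drops from a day to a few hours.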

[Screenshots: flexible restore points in the Azure portal]

Fast restore with Enhanced Restore Points

You can now restore your data warehouse within the same region in less than 20 minutes, regardless of its size. Fast restore also applies to cross-server restores when the target server is in the same region. In addition, the snapshot process has been optimized, reducing the time it takes to create a restore point; for lower performance levels (DWU), this enhancement has cut restore point creation time by at least 2x.

Custom restore configurations

When restoring your data warehouse through the Azure portal, you can now change your performance level (DWU) and even restore to an upgraded Gen2 data warehouse. This enables a side-by-side evaluation before upgrading your Gen1 data warehouse.

[Screenshot: custom restore configuration in the Azure portal]

Next Steps


Build secure Oozie workflows in Azure HDInsight with Enterprise Security Package


Customers love to use Hadoop, and often rely on Oozie, a workflow and coordination scheduler for Hadoop, to accelerate and ease their big data implementations. Oozie is integrated with the Hadoop stack and supports several types of Hadoop jobs. However, for users of Azure HDInsight with domain-joined clusters, Oozie was not a supported option. To get around this limitation, customers had to run Oozie on a regular cluster, which was costly and added administrative overhead. Today, we are happy to announce that customers can now use Oozie in domain-joined Hadoop clusters too.

In domain-joined clusters, authentication happens through Kerberos and fine-grained authorization is through Ranger policies. Oozie supports impersonation of users and a basic authorization model for workflow jobs.

Moreover, Hive Server 2 actions submitted as part of an Oozie workflow are logged and auditable through Ranger as well. Fine-grained authorization through Ranger is enforced on Oozie jobs only when Ranger policies are present; otherwise, coarse-grained authorization based on HDFS (only available on ADLS Gen1) is enforced.
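The authorization fallback described above (fine-grained Ranger policies when present, coarse-grained file-level permissions otherwise) can be sketched in Python; the policy structures here are hypothetical simplifications, not Ranger's actual API:

```python
def authorize(user, resource, action, ranger_policies, file_owner):
    """Decide access for an Oozie-submitted job.

    If any Ranger policy covers the resource, Ranger is authoritative
    (fine-grained per user and per action). Otherwise fall back to a
    coarse-grained check based on file-level ownership, as with
    HDFS-backed storage.
    """
    applicable = [p for p in ranger_policies if p["resource"] == resource]
    if applicable:
        return any(user in p["users"] and action in p["actions"]
                   for p in applicable)
    # Coarse-grained fallback: the owner of the path may do anything.
    return file_owner.get(resource) == user

policies = [{"resource": "sales_db", "users": {"alice"}, "actions": {"select"}}]
owners = {"hr_db": "bob"}

print(authorize("alice", "sales_db", "select", policies, owners))  # True  (Ranger allows)
print(authorize("alice", "sales_db", "drop", policies, owners))    # False (Ranger denies)
print(authorize("bob", "hr_db", "select", policies, owners))       # True  (ownership fallback)
```

The key property is that Ranger, when it has an applicable policy, fully overrides the coarse-grained check rather than combining with it.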

Learn more about how to create an Oozie workflow and submit jobs in a domain-joined cluster, and how to use Oozie with Hadoop to define and run a workflow on Linux-based Azure HDInsight.

Try HDInsight now

We hope you take full advantage of today’s announcements and we are excited to see what you will build with Azure HDInsight. Read this developer guide and follow the quick start guide to learn more about implementing these pipelines and architectures on Azure HDInsight. Stay up-to-date on the latest Azure HDInsight news and features by following us on Twitter #HDInsight and @AzureHDInsight. For questions and feedback, please reach out to AskHDInsight@microsoft.com.

About HDInsight

Azure HDInsight is Microsoft’s premium managed offering for running open source workloads on Azure. Today, we are excited to announce several new capabilities across a wide range of OSS frameworks.

Azure HDInsight powers mission-critical applications for top customers in a wide variety of sectors, including manufacturing, retail, education, nonprofit, government, healthcare, media, banking, telecommunications, and insurance, with use cases ranging from ETL to data warehousing and from machine learning to IoT.


IoT: the catalyst for better risk management in insurance


Thought leader Matteo Carbone titled his book All the Insurance Players Will Be Insurtech, meaning that insurance companies that embrace digital transformation and new technologies will lead the industry. Those technologies include the Internet of Things (IoT), artificial intelligence (AI), machine learning (ML), and big data. Carbone believes that the use of new technologies gives insurers “superpowers” to assess risk more accurately, manage risk continually, and mitigate risk in real time.


Getting these superpowers means converting IoT data into actionable insights, and using those insights to reduce risk through prevention and mitigation of claim events. As the powers grow, so do the benefits to insurance customers and providers. Insurers can also increase the pace of customer interaction: the growth in the number of interactions produces more data points, and the same data is used to prevent or mitigate risk while driving the sale of additional services outside the traditional insurance value chain. Remote monitoring and emergency alert services also provide peace of mind to the customer, and these are only the start. Insurance companies are now selling a matrix of other services layered on top of the base policy, and the income from these additional services often outpaces sales from the insurance policy itself. At the same time, customer retention grows, as does the upsell opportunity for other insurance products.
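As a minimal illustration of converting telemetry into actionable insights, the sketch below raises a mitigation alert when a reading crosses a safe limit, before it becomes a claim event; the sensor names and thresholds are invented for illustration:

```python
def mitigation_alerts(readings, thresholds):
    """Turn raw telemetry into actionable alerts before a claim event occurs.

    readings: list of (sensor, value) pairs from connected devices.
    thresholds: mapping of sensor name -> maximum safe value.
    """
    alerts = []
    for sensor, value in readings:
        limit = thresholds.get(sensor)
        if limit is not None and value > limit:
            alerts.append(f"{sensor}: {value} exceeds safe limit {limit}")
    return alerts

# Hypothetical connected-home telemetry: a water-flow spike suggests a leak.
thresholds = {"water_flow_lpm": 30, "basement_humidity_pct": 70}
readings = [("water_flow_lpm", 45), ("basement_humidity_pct", 55)]

for alert in mitigation_alerts(readings, thresholds):
    print(alert)  # water_flow_lpm: 45 exceeds safe limit 30
```

A real solution would feed such alerts into a remote-monitoring or emergency-notification service; the point here is only the shape of the data-to-action loop.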

Carbone stresses that IoT is an incredible opportunity for the insurance sector, and that the use of IoT data allows insurers to deliver superior value. IoT adds to the availability of real-time data and increases the number of interactions with the insured, resulting in a better customer experience, increased retention, and expanded relationships. Analysis of the new data streams deepens the understanding of risks and of the customer base, which can lead to the next generation of products.

Perhaps in response to this advice, insurance companies are moving to take advantage of this new stream of IoT data. Recently, an industry analyst estimated that 39 percent of insurers have launched or are piloting connected-home initiatives, and 44 percent consider connected devices to be a driver of future insurance revenue growth.


Recommended next steps

Don't start from scratch! Follow these steps to continue your journey towards assessing risk more accurately. 

  • Read the Azure IoT Central overview. Azure IoT Central is a fully managed IoT Software-as-a-Service solution that makes it easy to create products that connect the physical and digital worlds.
  • Create an Azure IoT Central sample application in minutes to become familiar with the tools, using the Azure IoT Central application quickstart.
  • Review the IoT developer’s guide. This guide provides an overview of Azure services that address key IoT solution requirements, as well as a step-by-step progression you can use to build proficiency and move toward a fully functioning solution quickly and easily.

I post regularly about new developments on social media. If you would like to follow me, you can find me on LinkedIn and Twitter.


Azure.Source – Volume 41


Now in preview

Azure Service Fabric Mesh is now in public preview - Azure Service Fabric Mesh is a fully managed service that enables developers to deploy and operate containerized applications without having to manage VMs, storage, or networking configuration, while keeping the enterprise-grade reliability, scalability, and mission-critical performance of Service Fabric. Service Fabric Mesh supports both Windows and Linux containers, so you can develop with the programming language and framework of your choice. The public preview of Service Fabric Mesh is available in three Azure regions - US West, US East, and Europe West. We will expand the service to other Azure regions in the coming months.

Azure Friday | Azure Service Fabric Mesh preview - Chacko Daniel joins Scott Hanselman to discuss Azure Service Fabric Mesh, which offers the same reliability, mission-critical performance and scale customers get with Service Fabric, but no more overhead of cluster management and patching operations. Service Fabric Mesh supports both Windows and Linux containers allowing you to develop with any programming language and framework of your choice.

Azure Security Center is now integrated into the subscription experience - Azure Security Center is available in public preview from within the subscription experience in the Azure portal. The new Security tab for your subscription provides a quick view into the security posture of your subscription, enabling you to discover and assess the security of your resources in that subscription and take action.

Speech services July 2018 update - The 0.5.0 update to the Speech SDK for Azure Cognitive Services was just released in preview. This update adds support for UWP (on Windows version 1709), .NET Standard 2.0 (on Windows), and Java on Android 6.0 (Marshmallow, API level 23) or later. It also includes feature changes and bug fixes. Most notably, it now supports long-running audio and automatic reconnection. This will make the Speech service more resilient in the event of time-out, network failures, or service errors. Improvements to error messages make it easier to handle errors.
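The resilience behavior described, automatic reconnection after time-outs or transient service errors, follows the familiar retry-with-backoff pattern. Here is a generic Python sketch of that pattern (not the Speech SDK's actual implementation):

```python
import time

def with_retries(operation, max_attempts=4, base_delay=0.5):
    """Run an operation, retrying transient failures with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # exhausted retries; surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...

# A flaky stand-in for a recognition call: fails twice, then succeeds.
calls = {"n": 0}
def recognize_once():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network failure")
    return "transcript"

print(with_retries(recognize_once, base_delay=0.01))  # transcript
```

The backoff keeps retries from hammering a struggling service, while the bounded attempt count ensures permanent failures still surface as errors.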

Also in preview

Now generally available

Score one for the IT Pro: Azure File Sync is now generally available! - Azure File Sync transforms your Windows Server machines into a quick cache of your Azure file share. You can use any protocol that's available on Windows Server to access your data locally, including SMB, Network File System (NFS), and File Transfer Protocol Service (FTPS). You can have as many caches as you need across the world. You must have an Azure storage account and an Azure file share in the same region where you want to deploy Azure File Sync. See region availability for Azure File Sync.

Also generally available

News and updates

Announcing the Azure Cloud Shell editor in collaboration with Visual Studio Code - Through collaboration with the Visual Studio Code team and their open-source Monaco project, the same web-standards based editor that powers Visual Studio Code is now integrated directly into Cloud Shell. The Monaco code editor brings features like syntax coloring, auto completion, and code snippets. The new Cloud Shell integration includes a file explorer to easily navigate the Cloud Shell file system for seamless file exploration. This enables a rich editing workflow by simply typing “code .” to open the editor’s file explorer from any Cloud Shell web-based experience.

Azure Friday | Azure Cloud Shell editor - Justin Luk joins Scott Hanselman to show the new Azure Cloud Shell editor. Since its launch, Cloud Shell included a variety of editors (vi, emacs, and nano) for editing files. In collaboration with the Visual Studio Code team and their open-source Monaco project, the same web standards-based editor that powers Visual Studio Code is now integrated directly into Cloud Shell.

Spring Data Gremlin for Azure Cosmos DB Graph API - Spring Data Gremlin is now available on Maven Central with source code available on GitHub. Spring Data Gremlin supports an annotation-oriented programming model to simplify mapping to database entities. It also provides support for creating database queries based on Spring Data Repository conventions.

Azure cost forecast API and other updates - The Azure Consumption APIs give you programmatic access to cost and usage data for your Azure resources. The latest update adds a Forecasts API, location and service data normalization, price sheet mapping to usage details, a flag for charges that are monetary commit ineligible, additional properties for identifying late arriving data, and PowerShell support. These APIs currently only support Enterprise Enrollments and Web Direct Subscriptions (with a few exceptions).
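To illustrate the idea behind a cost forecast, here is a naive linear projection of month-to-date spend; the real Forecasts API is more sophisticated, so treat this only as a conceptual sketch:

```python
def naive_forecast(month_to_date_cost, day_of_month, days_in_month):
    """Project end-of-month cost by extrapolating the average daily run rate."""
    daily_rate = month_to_date_cost / day_of_month
    return daily_rate * days_in_month

# $450 spent by day 15 of a 30-day month -> roughly $900 projected.
print(naive_forecast(450.0, 15, 30))  # 900.0
```

A production forecast would account for usage trends, reservations, and one-off charges rather than assuming a flat daily rate.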

Additional news and updates

The Open Source Show

The Open Source Show | Getting Started with Cloud Native Infrastructure - Cloud native. What does it really mean? Why do you need it? How and where should you get started? Justin Garrison, co-author of Cloud Native Infrastructure, and Bridget Kromhout cover the what, when, why, and how of cloud native, including what Justin's learned from studying massive organizations and CNCF projects. We close it out with "Cloud Native Fundamentals: Kubernetes in 90 seconds" from AKS Program Manager, Ralph Squillace.

Azure Friday

Azure Friday | Event-based data integration with Azure Data Factory - Gaurav Malhotra joins Scott Hanselman to discuss Event-driven architecture (EDA), which is a common data integration pattern that involves production, detection, consumption, and reaction to events. Learn how you can do event-based data integration using Azure Data Factory.

Technical content and training

Foretell and prevent downtime with predictive maintenance - Learn how Azure can power your predictive maintenance solution by using data about previous breakdowns to model when failures are about to occur, and intervene just as sensors detect the same conditions. Until recently this has not been a realistic option, as the modeling tools did not exist and real-time processing power was too expensive. See how Azure solves that problem.
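The approach described, flagging a machine when live sensor readings approach the conditions that preceded past breakdowns, can be sketched as a simple signature match; the sensor names, values, and tolerance below are illustrative:

```python
def matches_failure_signature(reading, failure_signatures, tolerance=0.1):
    """Return True if a live reading is within a relative tolerance of any
    pre-failure condition recorded before previous breakdowns."""
    for sig in failure_signatures:
        if all(abs(reading[k] - sig[k]) <= tolerance * abs(sig[k]) for k in sig):
            return True
    return False

# Conditions logged shortly before past failures (hypothetical data).
signatures = [{"vibration_mm_s": 9.0, "temp_c": 85.0}]

print(matches_failure_signature({"vibration_mm_s": 8.8, "temp_c": 83.0}, signatures))  # True
print(matches_failure_signature({"vibration_mm_s": 4.0, "temp_c": 60.0}, signatures))  # False
```

Real solutions typically replace this threshold check with a trained model over many sensors and failure histories, but the intervene-before-failure logic is the same.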

Getting started with IoT: how to connect, secure, and manage your “things” - The value of the data an IoT solution generates depends largely on how effectively you deploy and manage the devices. Learn how to get started with securing, provisioning, and managing your devices with Azure IoT Hub.

The Azure Podcast

The Azure Podcast: Episode 238 - Serial Console - Evan and Kendall talk to Craig Wiand, a Principal Engineer in the Azure team, about the new Serial Console feature available for VMs.

Customers and partners

Internet of Things Show | A Perspective on Industrial IoT Security by Trend Micro - Security is critical in IoT applications, but how different is it from (or similar to) traditional IT security? Richard Ku, an expert in industrial IoT security for enterprise OT environments at Trend Micro, visited the IoT Show to share his perspective on what you need to pay attention to when building IoT applications.

R3 on Azure: Launch of Corda Enterprise v3.1 - R3 launched their Corda Enterprise v3.1 offering to the Azure Marketplace with a free trial offer, giving you the opportunity to kick the tires before you buy. Corda Enterprise is the mission-critical version of Corda, the open source blockchain platform. Corda Enterprise brings the higher performance, high availability, and other capabilities demanded by enterprise organizations.

Globally replicated data lakes with LiveData using WANdisco on Azure - WANdisco enables globally replicated data lakes on Azure for analytics over the freshest data. With WANdisco Fusion, you can make data that you have used in other large-scale analytics platforms available in Azure Blob Storage, ADLS Gen1 and Gen2 without downtime or disruption to your existing environment.

Intelligent Healthcare with Azure Bring Your Own Key (BYOK) technology - Learn how Change Healthcare implemented Azure SQL Database Transparent Data Encryption (TDE) with BYOK support. TDE with BYOK encrypts databases, log files and backups when written to disk, which protects data at rest from unauthorized access.

Azure Marketplace June container offers - The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. In June we published 42 container offers from Bitnami, including container images for Cassandra, Kafka, and Open Service Broker for Azure.

Industries

Blockchain as a tool for anti-fraud - Healthcare costs are skyrocketing. There are many factors driving the high cost of healthcare, and one of them is fraud. There are two root vulnerabilities in healthcare organizations: insufficient protection of data integrity, and a lack of transparency. Learn how you can use blockchain as a tool for anti-fraud.

Azure This Week

A Cloud Guru | Azure This Week - 20 July 2018 - In this episode of Azure This Week, Lars looks at the public preview of Azure Data Box Disk, Dev Spaces for Azure Kubernetes Services and the general availability of Azure DevOps Projects.


Azure Marketplace new offers: June 16–30


We continue to expand the Azure Marketplace ecosystem. From June 16 to 30, 22 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual Machines

Altair HyperWorks Unlimited Virtual Appliance-BYOL

Altair HyperWorks Unlimited Virtual Appliance-BYOL: Altair’s HyperWorks Unlimited Virtual Appliance provides access to engineering Software-as-a-Service, allowing companies to expand computer-aided engineering capacity.

Autopsy

Autopsy: Autopsy is a digital forensics platform and graphical interface to The Sleuth Kit and other digital forensics tools. It is used by law enforcement, military, and corporate examiners to investigate what happened on a computer.

Denodo Platform 7.0

Denodo Platform 7.0: Denodo is a data virtualization product that integrates data from your Azure sources and your Software-as-a-Service applications to deliver a standards-based data gateway, making it quick and easy for users to access and use your cloud-hosted data.

F5 Advanced WAF

F5 Advanced WAF: The Advanced WAF provides robust web application firewall protection by securing applications against threats, including layer 7 DDoS attacks, malicious bot traffic, all Open Web Application Security Project top 10 threats, and API protocol vulnerabilities.

Hardened IIS On Windows Server 2012 R2

Hardened IIS On Windows Server 2012 R2: Internet Information Services (IIS) is an extensible web server created by Microsoft for use with the Windows NT family. This image is made for customers looking to deploy a self-managed Community edition on a hardened kernel.

Hardened IIS On Windows Server 2016

Hardened IIS On Windows Server 2016: Internet Information Services (IIS) is an extensible web server created by Microsoft for use with the Windows NT family. This image is made for customers looking to deploy a self-managed Community edition on a hardened kernel.

Hardened PHP MYSQL and IIS on Windows Server 2016

Hardened PHP MYSQL and IIS on Windows Server 2016: PHP is a server-side scripting language for web development, and it's used as a general-purpose programming language. This image is for customers looking to deploy a self-managed Community edition on a hardened kernel.

Hardened WebServer On Windows Server 2016

Hardened WebServer On Windows Server 2016: Internet Information Services (IIS) is an extensible web server created by Microsoft for use with the Windows NT family. This image is for customers looking to deploy a self-managed Community edition on a hardened kernel.

Web Applications

Aporeto Enterprise

Aporeto Enterprise: Aporeto provides network security, runtime protection, and API access control for cloud applications. These security capabilities are powered by application identity, a distinctive approach to uniquely identify and protect application resources.

EXASOL Analytics Database BYOL

EXASOL Analytics Database (BYOL): Exasol is an MPP database designed for business intelligence, analytics, and reporting. Exasol scales from 10s of gigabytes to 100s of terabytes of data, is quick to implement, and delivers extreme performance without cost and complexity.

Reblaze

Reblaze: Reblaze provides comprehensive, cloud-based intelligent web security that defends against DoS/DDoS, hacking and breaches, bots, scraping, and other web threats. It includes auto-scaling bandwidth, automated geo-weighting and distribution of traffic, and more.

Container Solutions

Hardened PHP MYSQL and IIS on Windows Server 2016

Hardened PHP MYSQL and IIS on Windows Server 2016: PHP is a server-side scripting language for web development, and it's a general-purpose programming language. This image is for customers looking to deploy a self-managed Community edition on a hardened kernel.

Consulting Services

Azure Lift-and-Shift FastStart 10-Day Workshop

Azure Lift-and-Shift FastStart: 10-Day Workshop: When it comes to costly on-premises infrastructure, the “lift-and-shift” approach can be a cost-saving method for taking in-house apps into the cloud. This workshop will assess the feasibility of migrating to Microsoft Azure.

Azure Migrate 5-Day Assessment

Azure Migrate: 5-Day Assessment: This five-day engagement will involve installing Azure Migrate and discovering and assessing on-premises vCenter virtual machines. It will include an assessment of the client's on-premises environment.

Azure Migrate 10-Day Assessment

Azure Migrate: 10-Day Assessment: This offering is a fixed-fee, 10-day extended engagement by Softlanding to get Azure Migrate installed while assessing the client’s on-premises environment.

Azure VM Backup FastStart 10-Day Implementation

Azure VM Backup FastStart: 10-Day Implementation: Designed to help organizations further protect cloud-based workloads, Softlanding’s Azure Virtual Machine Backup offering allows organizations to create and configure a simple and cost-effective Backup-as-a-Service solution.

Azure 1-Wk Assessment

Azure: 1-Wk Assessment: Phidiax's cloud assessment will provide a custom, detailed view of how Microsoft Azure will increase your organization’s agility and save on expenditures. Phidiax assists organizations with assessment, planning, migration, implementation, and support.

Azure 1-Wk PoC

Azure: 1-Wk PoC: You will receive a professional assessment by Phidiax's solution experts. This will include a customized professional implementation and proof of concept for Microsoft Azure solutions.

Customer 360 Powered by Zero2Hero

Customer 360 Powered by Zero2Hero: Deploy a proof of concept with defined and measurable business value in this offer from Bardess Group. Customer 360 Powered by Zero2Hero builds a more comprehensive view of customers by leveraging data of all types, sizes, and velocities.

DevOps Kickstart 3-Wk Implementation

DevOps Kickstart: 3-Wk Implementation: For organizations considering how automation and DevOps can be leveraged for agility, Diaxion will develop a roadmap to enable the DevOps initiative. A recommendation report will include an on-site workshop with the client.

Horizontal Portal with AEM - 1-Hr Briefing

Horizontal Portal with AEM: 1-Hr Briefing: This overview will cover LTI's horizontal portal solution, which acts as a personalized point of access, complete with an integrated user journey and customizable capabilities.

KPMG Adoxio Portal Azure 4-Wk Assessment

KPMG Adoxio Portal Azure: 4-Wk Assessment: KPMG Adoxio is offering a portal assessment service to assist customers in making informed decisions on their path forward when Microsoft ends support of Adxstudio Portals version 7 in August 2018.


Windows UI Library Preview released!

We’re excited to announce the first preview release of the Windows UI Library!

The Windows UI Library (or WinUI for short) is a new way to get and use Fluent controls and styles for building Windows 10 UWP apps: via NuGet packages.

The WinUI NuGet packages contain new and popular UWP XAML controls and features which are backward-compatible on a range of Windows 10 versions, from the latest insider flights down to the Anniversary Update (1607).

Benefits of the Windows UI Library

Previously, the UWP XAML app development framework was shipped and updated solely as part of Windows and the SDK. In order to get new features or fixes, you had to wait for a new version of Windows. And just as importantly, you had to wait for your users to update their OS as well.

Using the new WinUI NuGet package(s) has two main benefits for UWP XAML app developers:

  1. It lets you start building and shipping apps with new UWP XAML features immediately: since the NuGet packages are backward-compatible with a wider range of Windows 10 versions, you no longer have to wait for your users to update their OS before they can run your app with the latest features.
  2. It makes it simpler to build version adaptive apps: you usually won’t have to include version checks or conditional XAML markup to use controls or features in a WinUI package when you’re building an app that targets multiple versions of Windows 10.

Release Details

The Windows UI Library is built and maintained by the same engineering team responsible for the standard Windows 10 SDK, and follows the same development and testing process. However, shipping via NuGet gives our engineering team more flexibility in how we plan, release, and update the UWP XAML platform.

This initial release includes previews of the first two Windows UI packages:

  • Microsoft.UI.Xaml
  • Microsoft.UI.Xaml.Core.Direct

These are prerelease packages, similar to a Windows Insider SDK. A future release will include “RTM” versions of these packages, and may have additional features or breaking changes. The prerelease versions are fully functional for testing and evaluation purposes, but we recommend using the RTM versions for production apps.

Not all of the XAML platform is in WinUI. For future versions we’re evaluating moving more of the XAML platform to WinUI packages, and are also exploring options for moving our development process to an open source model on GitHub.

Until then, we look forward to hearing any issues or feedback you may have through the Feedback Hub or through the UWP UserVoice forum!

Getting Started

You can find more info on how to install and get started with the Windows UI Library preview in the documentation, which will be expanded in the coming weeks.

The NuGet packages for WinUI can be found via the NuGet Package Manager in Visual Studio, or by visiting the package pages.

The dev branch of the Xaml Controls Gallery sample app, available in the Microsoft Store, has also been updated to use WinUI as a usage example.

Note that to use the Windows UI Library, your UWP project’s Min version must be 14393 or higher and the Target version must be 17134 or higher.

Windows UI Library Contents

1) Existing controls and features, with wider version support

The Microsoft.UI.Xaml package includes a number of controls that are also part of the standard Windows 10 SDK, plus Fluent design elements like Reveal highlighting and the Acrylic material.

For example, apps often use a TreeView control to display and navigate a hierarchical list. A new native UWP Windows.UI.Xaml.Controls.TreeView control was added to Windows 10 as part of the April 2018 Update (1803), which means end-users have to install the April 2018 Update before they can run apps that use it. And not every user may be running that version of Windows 10 yet, especially in enterprise environments which evaluate and roll out updates at a slower pace.

However, with WinUI, TreeView is also available as a separate Microsoft.UI.Xaml.Controls.TreeView class which provides the same functionality as the default Windows 10 SDK version of the control, with added benefits:

  1. It works on a wider range of previous Windows 10 versions
  2. It includes new functionality which hasn’t been released in the standard SDK version yet

Apps can freely use both the default SDK version and the WinUI version of the same control. But you may as well use the WinUI version of a control when there’s one available since there’s little downside.

More info on the package contents is available in the documentation.

2) New and preview controls

The Microsoft.UI.Xaml package also includes new controls that will likely be included in the standard SDK with a future release of Windows, like CommandBarFlyout and MenuBar.

Prerelease versions of the packages may additionally contain early prerelease features and controls. For example, this first prerelease version contains some controls like Repeater, Scroller, and LayoutPanel: these controls are functional, but we’re still working on them and so they likely won’t be included in the first upcoming RTM release of WinUI.

3) XamlDirect

The Microsoft.UI.Xaml.Core.Direct package is a preview of a new standalone WinRT library for middleware developers.

It provides lower level access to most of the XAML framework, enabling better CPU and working set performance for middleware components.

It is functional on the Windows April 2018 Update and previous versions of Windows 10, but will provide improved performance on current RS5 insider builds and upcoming versions of Windows 10!

The post Windows UI Library Preview released! appeared first on Windows Developer Blog.

AI, Machine Learning and Data Science Roundup: July 2018

A monthly roundup of news about Artificial Intelligence, Machine Learning and Data Science. This is an eclectic collection of interesting blog posts, software announcements and data applications I've noted over the past month or so.

Open Source AI, ML & Data Science News

A review of the current state of the Julia project, including performance comparisons with Go, Python and R. 

Python 3.7.0 has been released. Also, Guido van Rossum steps down as the project's "Benevolent Dictator for Life".

TensorFlow 1.9 released, featuring a new getting started guide for Keras.

Apache OpenNLP 1.9.0 released, adding several improvements to the natural language processing library.

The PYPL Popularity of Languages Index ranks Python as the 1st and R as the 7th most popular programming languages.

R 3.5.1 has been released.

Industry News

Google introduces Cirq, an open-source framework to simulate noisy intermediate-scale quantum computers.

Hortonworks announces new partnerships with Google, Microsoft and IBM to bring its big data platform to the cloud.

IBM plans to release an annotated dataset of 1 million images, with the goal of reducing bias in facial recognition systems.

Amazon SageMaker enhances the DeepAR, BlazingText, and Linear Learner algorithms built into the machine learning framework.

A series on Data Ethics by Mike Loukides, Hilary Mason and DJ Patil. The most recent post proposes a checklist for ethical data science practices.

Apple has combined its various artificial intelligence divisions into a new structure led by recent hire John Giannandrea.

Microsoft News

Open data sets curated by Microsoft Research for computer vision, natural language processing and scientific analysis. 

Microsoft acquires Bonsai, a startup focused on using ML to train autonomous machines using simulated environments.

Microsoft President Brad Smith calls for public regulation and corporate responsibility in the use of facial recognition technology.

ML.NET 0.3 released adding ONNX support, LightGBM learners, and more to the .NET machine learning library.

Microsoft Speech Service SDK version 0.5.0 has been released, with support for long-running audio and automatic reconnections.

Azure HDInsight now supports Apache Spark 2.3, improving performance for Python-based interactions.

Azure Databricks now includes RStudio integration for R developers. The integration is described in detail at the Databricks blog.

The Microsoft Face API has significantly improved its ability to recognize gender across skin tones.

Learning resources

Andrey Kurenkov's essay on the limitations of reinforcement learning. A February 2018 blog post by Alex Irpan also provides some cautionary examples.

A Developer's Guide to Building AI Applications, a free e-book that provides a gentle introduction to the Microsoft AI Platform.

A tutorial on adding machine learning to mixed reality applications, using Azure ML Studio.

How to read and write Microsoft SQL Server data in Python using Pandas dataframes.

Object detection via distributed deep learning on multiple GPUs, using Horovod on Azure Batch AI.

How Artificial Intelligence Could Prevent Natural Disasters, from Wired.

A neural network for colorizing video also finds application in object tracking.

Computer vision with an embedded learning library from Microsoft Research used to monitor endangered Purple Martin nests at Disney Animal World.

A diabetic retinopathy prediction application, based on image analysis with convolutional neural networks and implemented with Azure Machine Learning.

Fashion similarity matching with Siamese Networks.

Find previous editions of the monthly AI roundup here

Announcing availability of Azure Managed Application in AzureGov

Azure Managed Applications enable Managed Service Provider (MSP) and Independent Software Vendor (ISV) partners, as well as enterprise IT teams, to deliver fully managed, turnkey cloud solutions that can be made available through the enterprise Service Catalog of a specific end customer. Customers can quickly deploy managed applications in their own subscription and rely on the partner or central IT team for maintenance operations and support across the lifecycle. 

It is the doorway through which enterprises consume Azure.

Organization Service Catalog as a distribution channel

Service Catalog allows organizations to create a catalog of approved applications and services that can be consumed by people in the organization. It can contain anything from customized virtual machine offers, servers, and databases to complex in-house applications. Maintaining such a catalog of solutions is especially helpful for central IT teams in enterprises, as it enables them to ensure compliance with organizational standards while providing great solutions for their organization. They can control, update, and maintain these applications, and employees in the organization can easily discover the rich set of applications that are recommended and approved by the IT department. Customers will only see the Service Catalog Managed Applications created by themselves or shared with them by other people in the organization.

Enterprises can control who gets to publish to the Service Catalog using Azure role-based access control (RBAC): a Service Catalog Admin role for publishers, and a separate role for consumers of the Service Catalog.

Publishing

Publishing to the Service Catalog is simple and can be performed using the Azure Portal, CLI, or PowerShell. The main components required are a) the template files, which describe the resources that will be provisioned, and b) the UI definition file, which describes how the required inputs for provisioning these resources will be displayed in the portal. The required files are packaged in a .zip file and uploaded through the Service Catalog blade in the portal. Below is the screenshot from the publishing portal. Learn more about how to publish and consume Service Catalog Managed Applications.

image
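For the CLI route, a definition can be created from the packaged .zip with `az managedapp definition create`. This is a hedged sketch: the resource group, display name, package URI, and the principal/role IDs below are all placeholders, not values from this article.

```shell
# Create a Service Catalog managed application definition from a packaged
# .zip (containing mainTemplate.json and createUIDefinition.json) that has
# been uploaded to a URI reachable by Azure. All names, URIs, and IDs below
# are placeholders.
az managedapp definition create \
  --name "SampleManagedApp" \
  --resource-group "AppDefinitionsRG" \
  --location "eastus" \
  --display-name "Sample managed application" \
  --description "Sample Service Catalog managed application" \
  --lock-level ReadOnly \
  --package-file-uri "https://example.blob.core.windows.net/appcontainer/app.zip" \
  --authorizations "<publisher-principal-id>:<role-definition-id>"
```

The `--lock-level ReadOnly` flag produces the locked resource group behavior described below; `None` would publish an unlocked application.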

Pricing

There are no additional fees for partners publishing Managed Applications into a customer's Service Catalog.

Customers are billed for the consumption of the Azure resources which are part of the Managed Application, using their regular billing construct. For example, if a virtual machine gets provisioned in the customer's subscription as part of the Managed Application, the customer will be charged for the virtual machine usage. Similarly, the fees partners charge customers for lifecycle operations will show as a new line item on the customer's Azure invoice.

Authorizations

The resource group containing the resources which are part of the Managed Application is locked for the customer. The customer has read-only access to the resources in this resource group, so the customer cannot accidentally delete or update the resources which are part of the Managed Application. The publisher can choose to publish an unlocked Managed Application as well, which would then allow the customers to make changes or delete the underlying components.

The publisher of the managed application, however, gets the required permissions to maintain, service, and upgrade the application in the customer’s tenant. These permissions are defined by the typical Azure RBAC roles. More details can be found in the Additional Resources section.

Publishing to the Customer’s Service Catalog

Below is a short summary of the key capabilities when publishing to the Service Catalog.

Service Catalog Managed Application

Publishing tools
  • Azure CLI
  • Azure PowerShell (Create Service->Service Catalog Managed Application Definition)
  • Azure Portal

Consumption tools
  • Azure Portal (by navigating to More Services->Managed Applications), Azure CLI, Azure PowerShell

Pricing
  • No fees to publish for partners.
  • Customers billed for Azure resources which get provisioned as part of the managed application.

Artifacts needed for package
  • mainTemplate.json
  • createUIDefinition.json

Use cases (customers)
  • Easy discoverability of approved IT services and applications
  • Simple acquisition and deployment
  • Abstract the end users from any underlying complexity of Azure resources.

Use cases (partners)
  • Deliver approved apps/services to developers and business units within an end-customer organization.
  • Abstract the end users from any underlying complexity of Azure resources.
  • Ensure managed apps deployed in customer tenants are free of tampering and unintended changes.
  • Ensure governance with corporate standards through the Service Catalog-defined application.

Please try out this new service and let us know your feedback through our user voice channel or in the comments below.

Additional resources

A quick tour of AI services in Azure

If you're after a quick overview of some of the services available in Azure to build AI-enabled applications, you might want to check out the 6-minute video below. It provides a tour of three services:

  • Azure Cognitive Services, pre-trained machine learning models to detect and characterize faces, text sentiment, and more 
  • Azure Machine Learning, a development, deployment and management framework for machine learning models you train yourself, and
  • Azure Bot Service, a framework for creating intelligent chat-based interfaces.

Want to learn more? You can find more videos and resources at the Azure Essentials page.

TD Bank empowers employees with assistive technology in Office 365 and Windows 10

Today’s post was written by Bert Floyd, senior IT manager of assistive technologies at TD Bank Group.

My journey with accessible technology at TD started more than 10 years ago, when I was called in to help incorporate a screen reader and Braille display into our retail environment for a new employee who was blind. Back then, it was a steep learning curve for the IT department. That’s not the case today. Over the intervening decade, we have created an inclusive corporate culture that celebrates everyone, including people with disabilities, and provides us with huge business potential. For example, one in seven people* in Canada identifies as having a disability, and there is an increasing incidence of age-related disabilities among our growing elderly demographic. Making sure our services are easily accessible is key to earning the business of this considerable segment of the population.

We are excited about introducing accessible technologies within Microsoft Office 365 and Windows 10 to empower our employees to help us work toward this strategic advantage. We find that most people can benefit from these accessible technologies, whether they identify as having a disability or not, because the technologies are built in to the Office apps. When employees can customize their environment and adapt to a wide variety of situations, they will be far more successful and productive.

And when we accommodate employees who identify as having a disability, we gain their insight and innovation to help us build accessibility right into our products and services. People with disabilities must think creatively about how to do things that other people don’t necessarily have to worry about, and we want to support that creativity in our workplace. We’re deploying Office 365 to all our employees and Windows 10 to almost 100,000 computers, which helps create an accessible workplace and ensure we will not miss out on hiring the best and the brightest.

From our websites to our brick-and-mortar branches and ATMs, we try to consider accessibility in every aspect of the customer experience. And we believe that with a more diverse and inclusive workforce, we’ll be in a better position to get there.

I’m excited about giving our employees the opportunity to leverage the accessibility features in Office 365 and Windows 10 in their everyday work lives. All employees need to think about accessibility, and everyone plays a role in creating a supportive, inclusive culture. When we all use the same inclusive tool set, there is enormous potential for improving productivity and driving awareness about the value of creating accessible documents and presentations for everyone to easily read and understand. Employees at TD already have access to Accessibility Checker, which makes it easy to spot problems and make content in Microsoft Word, Excel, PowerPoint, Outlook, and OneNote more accessible. People are learning about Narrator and Magnifier in Windows 10 and about the built-in color filters.

We have come a long way since hiring that first employee who was blind. Today, approximately 6 percent of our workforce identifies as having a disability. Our Assistive Technologies Lab welcomes anyone to come and learn about inclusive design and the technologies we have available to support our employees. We work with technology projects to help them conform to our IT accessibility standards, and we rolled out a training program for our developers and testers—a number of them with disabilities—to ensure we fully consider accessibility in our customer-facing products and services.

Today, TD prides itself on its diverse and talented workforce, and I’m incredibly lucky to be part of a great team that works hard to put so many resources behind our employees. Along with our assistive technologies, we are using Office 365 and Windows 10 to help us remove barriers for people with disabilities to create a more inclusive workplace that’s as diverse and exciting as the communities we serve.

*A profile of persons with disabilities among Canadians aged 15 years or older, 2012.

—Bert Floyd

Read the case study to learn more about how TD Bank is empowering its employees with assistive technology in Office 365 and Windows 10.

The post TD Bank empowers employees with assistive technology in Office 365 and Windows 10 appeared first on Microsoft 365 Blog.

Build richer applications with the new asynchronous Azure Storage SDK for Java

Cloud-scale applications typically require high concurrency to achieve desired performance when accessing remote data. The new Storage Java SDK simplifies building such applications by offering asynchronous operations, eliminating the need to create and manage a large thread pool. The new SDK uses the RxJava reactive programming model for asynchronous operations and relies on the Netty HTTP client for REST requests. Get started with the Azure Storage SDK for Java now.

Azure Storage SDK v10 for Java adopts the next-generation Storage SDK design providing thread-safe types that were introduced earlier with the Storage Go SDK release. This new SDK is built to effectively move data without any buffering on the client, and provides interfaces close to the ones in the Storage REST APIs. Some of the improvements in the new SDK are:

  • Asynchronous programming model with RxJava
  • Low-level APIs consistent with Storage REST APIs
  • New high-level APIs built for convenience
  • Thread-safe interfaces
  • Consistent versioning across all Storage SDKs

Asynchronous programming model with RxJava

Now that the Storage SDK supports RxJava, it is easier to build event-driven applications, because RxJava lets you compose sequences together with the observer pattern. The following sample, which uploads a directory of .xml files as they are found, illustrates this pattern:

// Walk the directory and filter for .xml files
Stream<Path> walk = Files.walk(filePath).filter(p -> p.toString().endsWith(".xml"));

// Upload files found asynchronously into Blob storage in 20 concurrent operations
Observable.fromIterable(() -> walk.iterator()).flatMap(path -> {
    BlockBlobURL blobURL = containerURL.createBlockBlobURL(path.getFileName().toString());

    FileChannel fc = FileChannel.open(path);
    return TransferManager.uploadFileToBlockBlob(
        fc, blobURL,
            BlockBlobURL.MAX_PUT_BLOCK_BYTES, null)
        .toObservable()
        .doOnError(throwable -> {
             if (throwable instanceof RestException) {
                 System.out.println("Failed to upload " + path + " with error:" + ((RestException) throwable).response().statusCode());
             } else {
                 System.out.println(throwable.getMessage());
             }
         })
         .doAfterTerminate(() -> {
              System.out.println("Upload of " + path + " completed");
              fc.close();
          });

    }, 20)  // Max concurrency of 20 - this is usually determined based on the number of cores you have in your environment
    .subscribe();

The full sample is located at the Azure Storage Java SDK samples repository.

The sample above calls TransferManager.uploadFileToBlockBlob, a high-level API, as the Observable emits items (here, java.nio.file.Path values). Using flatMap, we can cap the number of concurrent operations, set to 20 in this example. To upload these files with the Azure Storage SDK v7, we would have to create and manage up to 20 threads ourselves, whereas here RxJava manages the thread pool and uploads the same data set concurrently on far fewer threads, which is more resource-efficient.

For more information, read about RxJava and the reactive programming model.
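To make the contrast with the older synchronous SDK concrete, here is a minimal, self-contained sketch (plain JDK, no Azure SDK) of the thread management a v7-style blocking client pushes onto the caller. Note that uploadFile and runUploads are hypothetical stand-ins for illustration, not SDK APIs:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ManualThreadPoolUpload {

    // Hypothetical stand-in for a blocking v7-style upload call.
    static void uploadFile(String path, AtomicInteger completed) {
        completed.incrementAndGet();
    }

    /** Uploads every path on a caller-managed fixed pool; returns the completed count. */
    static int runUploads(List<String> files, int poolSize) {
        AtomicInteger completed = new AtomicInteger();

        // With a synchronous SDK, the caller owns (and must size, drain,
        // and shut down) the thread pool.
        ExecutorService pool = Executors.newFixedThreadPool(poolSize);
        for (String f : files) {
            pool.submit(() -> uploadFile(f, completed));
        }
        pool.shutdown();
        try {
            pool.awaitTermination(10, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return completed.get();
    }

    public static void main(String[] args) {
        System.out.println("Uploaded " + runUploads(List.of("a.xml", "b.xml", "c.xml"), 20) + " files");
    }
}
```

With RxJava's flatMap(..., 20) in the sample above, all of this pool bookkeeping disappears from application code.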

Low-level APIs consistent with storage REST APIs

Low-level APIs exist on the URL types, e.g. BlockBlobURL, and are designed to be simple wrappers around the REST APIs, providing convenience but no hidden behavior. Each call to these low-level APIs guarantees exactly one REST request sent (excluding retries). Further, the names on these types have been updated to make their behavior clearer. For example, PutBlob is now Upload, PutBlock is now StageBlock, and PutBlockList is now CommitBlockList.

BlockBlobURL blobURL = containerURL.createBlockBlobURL("mysampledata");
 
String data = "Hello world!";
blobURL.upload(Flowable.just(ByteBuffer.wrap(data.getBytes())), data.length(), null, null, null)
    .subscribe(blockBlobsUploadResponse -> {
        System.out.println("Status code: " + blockBlobsUploadResponse.statusCode());
    }, throwable -> {
        System.out.println("Throwable: " + throwable.getMessage());
    });

New high-level APIs, built for convenience

The TransferManager class is where we provide convenient high-level APIs that internally call the other lower-level APIs. For example, the uploadFileToBlockBlob method can upload a 1GB file by internally making 10 x StageBlock calls (with each block configured as 100MB in size) followed by 1 call to CommitBlockList to atomically commit the uploaded blocks in the Blob service.

Single<CommonRestResponse> response = TransferManager.uploadFileToBlockBlob(
        FileChannel.open(filePath), blobURL,
        BlockBlobURL.MAX_PUT_BLOCK_BYTES, null)
        .doOnError(throwable -> {
            if (throwable instanceof RestException) {
                System.out.println("Failed to upload " + filePath + " with error:" + ((RestException) throwable).response().statusCode());
            } else {
                System.out.println(throwable.getMessage());
            }
        })
        .doAfterTerminate(() -> System.out.println("Upload of " + filePath + " completed"));

response.subscribe(commonRestResponse -> {System.out.println(commonRestResponse.statusCode());});

Thread-safe interfaces

Earlier Storage SDKs (version 9 or earlier) offered objects such as CloudBlockBlob and CloudBlobContainer which weren't thread-safe and were mutable in ways that could cause issues at runtime. The new Storage SDKs (v10 and above) provide interfaces closer to the Storage REST APIs, and most of the associated objects are immutable, allowing them to be shared.

For instance, when you want to perform an operation on a blob (e.g., http://account.blob.core.windows.net/container/myblob), you construct a BlockBlobURL object with the blob URI, and all the associated REST API operations are methods on that type. You can call Upload, StageBlock, and CommitBlockList on that object. These methods all return a Single (io.reactivex.Single) of RestResponse<THeaders, TBody> that wraps the REST API response. The response is immutable, and the call does not modify the BlockBlobURL instance itself.
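The thread-safety benefit of immutability can be sketched without the SDK itself. In the plain-JDK example below, BlobUrl is a hypothetical stand-in for a URL type: because its only field is final and its methods return new values rather than mutating state, one instance can be shared across parallel workers with no locking:

```java
import java.util.stream.IntStream;

public class ImmutableSharing {

    // Hypothetical immutable URL-like type: final field, methods return new values.
    static final class BlobUrl {
        final String uri;
        BlobUrl(String uri) { this.uri = uri; }
        String upload(String data) { return "PUT " + uri + " (" + data.length() + " bytes)"; }
    }

    /** Shares one immutable BlobUrl across parallel workers; counts successful calls. */
    static long uploadFromManyThreads(BlobUrl shared, int workers) {
        return IntStream.range(0, workers)
                .parallel()
                .mapToObj(i -> shared.upload("hello")) // safe: no shared mutable state
                .filter(s -> s.startsWith("PUT " + shared.uri))
                .count();
    }

    public static void main(String[] args) {
        BlobUrl url = new BlobUrl("https://account.blob.core.windows.net/container/myblob");
        System.out.println(uploadFromManyThreads(url, 8));
    }
}
```

The mutable CloudBlockBlob-style objects of earlier SDK versions could not be shared this freely, which is the design issue the v10 URL types address.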

New Storage SDK versions

The new SDKs follow a new versioning strategy that is tied to the Storage service REST API version. Version 10, the current release, is tied to Storage REST API version 2018-03-28. All new Storage SDKs across all programming languages will use v10 for the REST API 2018-03-28 release, making it easy to navigate between versions. When a new SDK is released supporting the next REST API version, its major version will be bumped (v10 to v11, for instance) regardless of any breaking changes in the client, mainly because service behavior can change when moving from one REST API version to another.

Versions earlier than 10 will be reserved for the older Storage SDK design. Any Storage SDK with the version 10 or later will adopt the new SDK design.

Get started now

To get started with the Azure Storage SDK v10 for Java, use the following Blob maven package (File and Queue coming soon).

<dependency>
     <groupId>com.microsoft.azure</groupId>
     <artifactId>azure-storage-blob</artifactId>
     <version>10.0.1-Preview</version>
</dependency>

Here are a few helpful links to help you get started:

Roadmap

Azure Storage SDK v10 for Java is currently in Preview and supports Blob storage only. We'll be releasing a few updates soon adding more functionality based on user feedback. So please check it out and let us know your feedback on GitHub. Here are a few of the major changes that are scheduled for a release soon:

  • Support for 2018-03-28 coming very soon
  • Support for Queue, File services
  • GA release

Top feature requests added with Azure Blockchain Workbench 1.2.0

We’re excited to see a ton of engagement and positive feedback on Azure Blockchain Workbench since our initial public preview release in May. Last month, we made our first major update to the public preview release based on your feedback and feature requests. Today, we’re releasing our next update to Workbench, which we’re calling version 1.2.0. You can either deploy a new instance of Workbench through the Azure Portal or upgrade your existing deployment to 1.2.0 using our upgrade script.

This update includes the following improvements:

Enable/disable apps

Many of you have started to iterate and create multiple blockchain apps using Workbench. One of the most requested features we’ve heard is the ability to disable unused blockchain apps within the Workbench Web app. With 1.2.0, you will be able to enable or disable applications. In addition, the UI will allow you to filter the list of applications to only show enabled or disabled applications.


BYOB – Bring Your Own Blockchain

As part of the Workbench deployment, we deploy a set of Ethereum Proof-of-Authority (PoA) nodes within a single member’s subscription. This topology works great for situations where it’s OK to have one member manage all the blockchain nodes. With version 1.2.0, customers will have the choice to deploy Workbench and point that deployment to an existing Ethereum PoA network. The topology of the network can be anything you want, including a multi-member blockchain network. The network can exist anywhere, which means the network can be on-premises, a hybrid deployment, or even on another cloud. The requirements are:

  • The endpoint must be an Ethereum Proof-of-Authority (PoA) blockchain network. Coming soon will be a new standalone Ethereum PoA solution, which can be used across members within a consortium.
  • The endpoint must be publicly accessible over the network.
  • The PoA blockchain network should be configured to have gas price set to zero (note, Workbench accounts are not funded, so if funds are required, transactions will fail).

The interface that allows connecting to an existing network is shown below.


Note that we have only tested Workbench against a PoA network and while other protocols may work, we cannot provide support for them at this time. We plan to release a PoA single/multi-member solution template similar to our Proof-of-Work solution template in the Azure Marketplace soon and that will be compatible with BYOB.

New supported datatype – Enum

Workbench supports several basic datatypes, such as strings and integers. With 1.2.0, we now support the enum datatype. Enums allow you to create blockchain apps where you can narrow the selection for certain fields to a specific list. The Workbench Web app UI shows enums as a single-select dropdown list. The enum type is specified via the configuration file as follows:
         {
           "Name": "PropertyType",
           "DisplayName": "Property Type",
           "Description": "The type of the property",
           "Type": {
             "Name": "enum",
             "EnumValues": ["House", "Townhouse", "Condo", "Land"]
           }
         }

This configuration corresponds to the code within the smart contract. For example:

         enum PropertyTypeEnum {House, Townhouse, Condo, Land}
         PropertyTypeEnum public PropertyType;

In the Web app, you will see the following dropdown UI:

You can reference how to use the enum type in the Room Thermostat sample application.

Better visibility into Workbench application deployment errors

To make it easier to deploy a working Workbench application, we will show Solidity warnings. These warnings don’t necessarily mean you have bugs that need to be fixed; rather, they indicate potential errors you might encounter when running your application.

Improvements and bug fixes

We also made several other improvements to Workbench, including:

  • The Workbench frontend and backend now fully work with more than 100 provisioned users. You can connect to Azure Active Directory tenants with more than 100 users, and the Workbench client lets you search for and add users within large directories.
  • Deployment of Workbench is more reliable as we’ve addressed the top failure cases, such as database initialization and naming conflicts. 
  • Performance of the Workbench REST API has been improved by moving App Service plan to Premium V2. The higher end CPU will result in better overall performance. Note, there is no change in cost from moving to this plan.

Please use our Blockchain User Voice to provide feedback and suggest features/ideas for Workbench. Your input is helping make this a great service.  We look forward to hearing from you.

Event Grid June updates: Dead lettering, retry policies, global availability, and more

Since our updates at //Build 2018, the Event Grid team’s primary focus has been delivering updates that make it easier for you to run your critical workloads on Event Grid. With that in mind, and always aiming for a better development experience, today we are announcing the release of dead lettering of events to Blob Storage, configurable retry policies, availability in all public regions, Azure Container Registry as an event publisher, SDK updates covering all these new features, and portal UX updates! Let’s dig into what all this means for you.

Dead Lettering

Dead lettering is a common pattern in event and messaging architectures that allows you to handle failed events in a specific way. An event delivery may fail because the endpoint receiving the event is continually down, authorization has changed, the event is malformed, or for any number of other reasons. That obviously doesn’t mean the event isn’t important and should simply be thrown away, as every single event can carry critical business value. Even when it doesn’t, it’s useful to track failed events for post-mortems or telemetry.

Dead lettering is now built right into Azure Event Grid, sending those failed events to Blob Storage. Each Event Subscription can have dead lettering enabled, so if delivery of events to its endpoint repeatedly fails over the course of the retry period, the event can be pushed to a Blob Storage container rather than being dropped. And because Azure Blob Storage is able to emit BlobCreated events, you can receive an event anytime something is dead lettered and have your system take action on it.

Take a look at how easy it is to set dead letter location for your events, and start improving the reliability of your solutions today by enabling it on your Event Subscriptions.

Retry Policies

Working in conjunction with dead lettering, you can now specify how long an event should be retried and how quickly it should be dead lettered. If you’re building a real-time system that only cares about what happened in the last five minutes and your system goes down, it likely isn’t interesting or helpful to have events from a few hours ago delivered when you come back online. Or maybe you know that if your system starts getting overwhelmed by traffic and you reject some events, having them retried is only going to make it worse. You can dead letter those events and pull them from storage when traffic returns to normal.

Retry policies in Azure Event Grid now allow you to specify both a maximum number of retries and a maximum retry time window for each Event Subscription. Once the lesser of those two limits is exceeded, the event is automatically pushed to the dead letter destination or dropped, depending on how you have configured the subscription. The defaults are 24 hours (1,440 minutes) and 30 attempts, but you can easily change those settings to your preferred time window and number of retries to make sure you are not missing a single event.
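The "lesser of the two limits wins" behavior can be sketched as follows. This is a plain-JDK illustration of the policy, not Event Grid's actual implementation; deliver, endpoint, and deadLetter are all hypothetical names chosen for the example:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Predicate;

public class RetryPolicySketch {

    /**
     * Retries delivery until either maxAttempts is reached or the
     * time-to-live expires, whichever comes first; events that never
     * succeed are pushed to the dead-letter queue. Returns true if delivered.
     */
    static boolean deliver(String event,
                           Predicate<String> endpoint,
                           int maxAttempts,
                           Duration ttl,
                           Queue<String> deadLetter) {
        Instant expiry = Instant.now().plus(ttl);
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (Instant.now().isAfter(expiry)) {
                break; // retry time window exhausted
            }
            if (endpoint.test(event)) {
                return true; // delivered successfully
            }
        }
        deadLetter.add(event); // both limits hit without success
        return false;
    }

    public static void main(String[] args) {
        Queue<String> deadLetter = new ArrayDeque<>();
        // Defaults from the post: 1,440-minute window, 30 attempts; endpoint always fails here.
        boolean ok = deliver("evt-1", e -> false, 30, Duration.ofMinutes(1440), deadLetter);
        System.out.println("delivered=" + ok + ", deadLettered=" + deadLetter.size());
    }
}
```

In the real service the dead-letter destination is a Blob Storage container rather than an in-memory queue, and dropped-versus-dead-lettered is controlled by whether dead lettering is enabled on the Event Subscription.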

Global Availability

One of the biggest asks we have had over the last few months is to be present wherever your workloads are. I’m incredibly happy to announce that we have been diligently working on deploying to all public regions, and we are now available in the following ones:

  • Australia Central
  • Australia Central 2
  • Australia East
  • Australia Southeast
  • Brazil South
  • Canada Central
  • Canada East
  • Central India
  • Central US
  • East Asia
  • East US
  • East US 2
  • France Central
  • France South
  • Japan East
  • Japan West
  • Korea Central
  • Korea South
  • North Central US
  • North Europe
  • South Central US
  • South India
  • Southeast Asia
  • UK South
  • UK West
  • West Central US
  • West Europe
  • West India
  • West US
  • West US 2

The few remaining regions will become available in the coming months, followed by availability in Government and Sovereign clouds.

Azure Container Registry Events

Azure Container Registry now natively emits events when an image is pushed or deleted. You can subscribe to these events and react to them in real time as your container images change, opening up a new way to automate your deployment workflows.

Watch the properties and schema for Container Registry events, and start using them today to save time on your deployment tasks.

SDK and Portal updates

Preview versions of all Event Grid SDKs have been updated to enable all new and preview features in Event Grid, including Retry Policies, Dead Lettering, event delivery schema, input schema mappings, queues as a destination, and hybrid connections integration. Check the latest versions of all SDKs and choose the one that better fits your needs.

The Azure Portal blades for Event Grid have also been updated to make it easier to find your events within the context of a service, as well as to manage your event subscriptions. An advanced editor has been added as well, allowing you to edit the JSON definition of any event subscription directly in the portal for any advanced or new scenarios you have.

We’re incredibly excited about the updates we are releasing today and love all the feedback we have been getting from the community. We will continue improving Event Grid, making it easier to use and better at supporting your needs and scenarios. Please keep engaging with us by providing your feedback.

Enabling administrators to revoke VSTS access tokens

As promised in the Protecting our users from the ESLint NPM package breach blog post last week, we have deployed new REST APIs to allow administrators of Visual Studio Team Services (VSTS) accounts to centrally revoke Personal Access Tokens (PAT) and JSON Web Tokens (JWT) created by users in their accounts. We’ve reviewed our system telemetry and... Read More

Example Code – Opinionated ContosoUniversity on ASP.NET Core 2.0’s Razor Pages

The best way to learn about code isn't just writing more code - it's reading code! Not all of it will be great code and much of it won't be the way you would do it, but it's a great way to expand your horizons.

In fact, I'd argue that most people aren't reading enough code. Perhaps there aren't enough clean code bases to check out and learn from.

I was pleased to stumble on this code base from Jimmy Bogard called Contoso University at https://github.com/jbogard/ContosoUniversityDotNetCore-Pages.

There's a LOT of good stuff to read in this repo, so I won't claim to have read it all or as deeply as I could. In fact, there's a good solid day of reading and absorbing here. However, here are some of the things I noticed and appreciate. Some of this is very "Jimmy" code, since it was written for and by Jimmy. This is a good thing and not a dig. We all collect patterns, make libraries, and develop our own spins on architectural styles. I love that Jimmy has collected a bunch of things he's created or contributed to over the years and put them into a nice clear sample for us to read. As Jimmy points out, there's a lot in https://github.com/jbogard/ContosoUniversityDotNetCore-Pages to explore:

Clone and Build just works

A low bar, right? You'd be surprised how often I git clone someone's repository and they haven't tested it elsewhere. Bonus points for a build.ps1 that bootstraps whatever needs to be done. I had .NET Core 2.x on my system already and this build.ps1 got the packages I needed and built the code cleanly.

It's an opinionated project with some opinions. ;) And that's great, because it means I'll learn about techniques and tools that I may not have used before. If someone uses a tool that's not among the defaults, it may mean the defaults are lacking!

  • Build.ps1 uses a build-script style taken from PSake, a PowerShell build automation tool.
  • It builds to a folder called ./artifacts as a convention.
  • Inside build.ps1, it's using Roundhouse, a Database Migration Utility for .NET using sql files and versioning based on source control http://projectroundhouse.org
  • It's set up for Continuous Integration in AppVeyor, a lovely CI/CD system I use myself.
  • It uses the Octo.exe tool from OctopusDeploy to package up the artifacts.

Organized and Easy to Read

I'm finding the code easy to read for the most part. I started at Startup.cs to just get a sense of what middleware is being brought in.

public void ConfigureServices(IServiceCollection services)
{
    services.AddMiniProfiler().AddEntityFramework();
    services.AddDbContext<SchoolContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));
    services.AddAutoMapper(typeof(Startup));
    services.AddMediatR(typeof(Startup));
    services.AddHtmlTags(new TagConventions());
    services.AddMvc(opt =>
        {
            opt.Filters.Add(typeof(DbContextTransactionPageFilter));
            opt.Filters.Add(typeof(ValidatorPageFilter));
            opt.ModelBinderProviders.Insert(0, new EntityModelBinderProvider());
        })
        .SetCompatibilityVersion(CompatibilityVersion.Version_2_1)
        .AddFluentValidation(cfg => { cfg.RegisterValidatorsFromAssemblyContaining<Startup>(); });
}
Here I can see what libraries and helpers are being brought in, like AutoMapper, MediatR, and HtmlTags. Then I can go follow up and learn about each one.

MiniProfiler

I've always loved MiniProfiler. It's a hidden gem of .NET and it's been around being awesome forever. I blogged about it back in 2011! It sits in the corner of your web page and gives you REAL actionable details on how your site behaves and what the important perf timings are.

MiniProfiler is the profiler you didn't know you needed

It's even better with EF Core in that it'll show you the generated SQL as well! Again, all inline in your web site as you develop it.

inline SQL in MiniProfiler

Very nice.

Clean Unit Tests

Jimmy is using XUnit and has an IntegrationTestBase here with some stuff I don't understand, like SliceFixture. I'm marking this as something I need to read up on and research. I can't tell if this is the start of a new testing helper library, as it feels too generic and important to be in this sample.

He's using the CQRS (Command Query Responsibility Segregation) pattern. Here he starts with a Create command, sends it, then runs a Query to confirm the results. It's very clean, and he's got a very isolated test.

[Fact]
public async Task Should_get_edit_details()
{
    var cmd = new Create.Command
    {
        FirstMidName = "Joe",
        LastName = "Schmoe",
        EnrollmentDate = DateTime.Today
    };
    var studentId = await SendAsync(cmd);
    var query = new Edit.Query
    {
        Id = studentId
    };
    var result = await SendAsync(query);
    result.FirstMidName.ShouldBe(cmd.FirstMidName);
    result.LastName.ShouldBe(cmd.LastName);
    result.EnrollmentDate.ShouldBe(cmd.EnrollmentDate);
}

FluentValidator

https://fluentvalidation.net is a helper library for creating clear strongly-typed validation rules. Jimmy uses it throughout and it makes for very clean validation code.

public class Validator : AbstractValidator<Command>
{
    public Validator()
    {
        RuleFor(m => m.Name).NotNull().Length(3, 50);
        RuleFor(m => m.Budget).NotNull();
        RuleFor(m => m.StartDate).NotNull();
        RuleFor(m => m.Administrator).NotNull();
    }
}

Useful Extensions

Looking at a project's C# extension methods is a great way to determine what the author feels are gaps in the underlying included functionality. These are useful for returning JSON from Razor Pages!

public static class PageModelExtensions
{
    public static ActionResult RedirectToPageJson<TPage>(this TPage controller, string pageName)
        where TPage : PageModel
    {
        return controller.JsonNet(new
            {
                redirect = controller.Url.Page(pageName)
            }
        );
    }
    public static ContentResult JsonNet(this PageModel controller, object model)
    {
        var serialized = JsonConvert.SerializeObject(model, new JsonSerializerSettings
        {
            ReferenceLoopHandling = ReferenceLoopHandling.Ignore
        });
        return new ContentResult
        {
            Content = serialized,
            ContentType = "application/json"
        };
    }
}

PaginatedList

I've always wondered what to do with helper classes like PaginatedList. Too small for a package, too specific to be built-in? What do you think?

public class PaginatedList<T> : List<T>
{
    public int PageIndex { get; private set; }
    public int TotalPages { get; private set; }
    public PaginatedList(List<T> items, int count, int pageIndex, int pageSize)
    {
        PageIndex = pageIndex;
        TotalPages = (int)Math.Ceiling(count / (double)pageSize);
        this.AddRange(items);
    }
    public bool HasPreviousPage
    {
        get
        {
            return (PageIndex > 1);
        }
    }
    public bool HasNextPage
    {
        get
        {
            return (PageIndex < TotalPages);
        }
    }
    public static async Task<PaginatedList<T>> CreateAsync(IQueryable<T> source, int pageIndex, int pageSize)
    {
        var count = await source.CountAsync();
        var items = await source.Skip((pageIndex - 1) * pageSize).Take(pageSize).ToListAsync();
        return new PaginatedList<T>(items, count, pageIndex, pageSize);
    }
}

I'm still reading all the source I can. Absorbing what resonates with me, considering what I don't know or understand and creating a queue of topics to read about. I'd encourage you to do the same! Thanks Jimmy for writing this large sample and for giving us some code to read and learn from!


Sponsor: Scale your Python for big data & big science with Intel® Distribution for Python. Near-native code speed. Use with NumPy, SciPy & scikit-learn. Get it Today!



© 2018 Scott Hanselman. All rights reserved.
     

Visual Studio IntelliCode expands AI-assisted coding to Python in Visual Studio Code

Today at the EuroPython conference in Edinburgh, Scotland, we are introducing support for Python in the IntelliCode extension for Visual Studio Code, as well as new improvements in editing for Python developers in Visual Studio Code.

AI-assisted IntelliSense for Python

At Microsoft Build 2018, we first announced IntelliCode, a set of AI-assisted capabilities to improve developer productivity. We shipped a preview extension showing the first of these capabilities: presenting the most relevant completions as you type in Visual Studio. Today, we are bringing the power of machine learning trained on over 2,000 open-source repos to Python developers using Visual Studio Code. Check out the video below to see IntelliCode for Python in action.

When writing Python code, the IntelliSense code completion speeds up your productivity by providing suggestions as you type. However, there can be hundreds of completions to scroll through, and you typically need to type the first few characters out before you can get close to the right result. Using our machine learning algorithm, IntelliCode can infer the most relevant completions based on the current code context.

In the below example of writing TensorFlow code, we can see the starred items in the completion list are provided by IntelliCode, allowing you to select from the top items rather than having to search through the completion list:

Example of writing TensorFlow code showing starred items in completion list provided by IntelliCode, without having to search through completion list

Notice how in each case we used the same tf variable, but the suggestions change to present the most relevant options for the code context in which it is being used.

These suggestions work on the broad variety of code included in our 2,000+ repo training set: from machine learning frameworks and web frameworks to general-purpose scripting. Simply install both the IntelliCode extension and the Python extension, and IntelliCode will download the latest machine learning model and start making suggestions.

Microsoft Python Language Server (Preview)

For Python, IntelliCode builds on top of our Microsoft Python Language Server: our fast and powerful language server which was introduced in the July 2018 release of the Python extension for Visual Studio Code.

IntelliCode will prompt you to enable the Microsoft Python Language Server, so you’ll also notice enhanced IntelliSense: syntax errors and basic warnings now appear as you type. IntelliSense, combined with the new Python Language Server, is also richer: the language server uses open-source typeshed definitions to provide completions for types where they can’t be statically inferred.

The combination of both the new Python extension features and IntelliCode will make you more productive when writing everyday Python code.

More Improvements for Python in Visual Studio Code

We have been continuing to add improvements for Python developers in Visual Studio Code. At Build 2018, we announced the public preview of Visual Studio Live Share including support for Python, allowing you to collaborate on code with other co-workers regardless of whether they are using the Visual Studio IDE or Visual Studio Code for their Python coding.

For everything else Python at Microsoft, be sure to check out our Python at Microsoft blog.

Get Involved

As we expand IntelliCode’s capabilities to more scenarios and other languages, we’ll announce a limited private preview of new features.

Sign up to become an insider to keep up with the project and join the waitlist for the private preview.

Dan Taylor, Senior Program Manager, Python Developer Tools

Dan Taylor is the Program Manager for Python developer tools at Microsoft. He has been at Microsoft for 7 years and has previously worked on performance improvements to .NET and Visual Studio, as well as profiling and diagnostic tools in Visual Studio and Azure.

 
