
Welcome our newest family member – Data Box Disk


Last year at Ignite, I talked to you about the preview of Azure Data Box, a ruggedized, portable, and simple way to move large datasets into Azure. So far, the response has been phenomenal. Customers have used Data Box to move petabytes of data into Azure.

While our customers and partners love Data Box, they told us that they also wanted a lower capacity, even easier-to-use option. They cited examples such as moving data from Remote/Office Branch Offices (ROBOs), which have smaller data sets and minimal on-site tech support. They said they needed an option for recurring, incremental transfers for ongoing backups and archives. And they said it needed to have the same traits as Data Box – namely fast, simple, and secure.

Got it. We hear the message loud and clear. So, I’m here today with our partners at Inspire 2018 to announce a new addition to the Data Box family: Azure Data Box Disk.


How it works

Data Box Disk leverages the same infrastructure and management experience as Azure Data Box. You can receive up to five 8 TB disks, totaling 40 TB per order. Data Box Disk is fast, utilizing SSD technology, and ships overnight, so you can complete a data transfer job in as little as one week.

The disks connect via USB or SATA, and simple commands such as robocopy or drag-and-drop can be used to move data. Quick and easy. Once the disks are returned to the Azure datacenter, your data is securely uploaded and the disks are cryptographically erased. Data Box Disk uses AES 128-bit encryption, so your data is safe at every point in the process, just like Data Box.
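Copying onto the disk really is just an ordinary file copy. As a rough illustration (the mount path and directory names here are hypothetical; on Windows the disk simply appears as a drive letter), a recursive copy could be sketched in Python as:

```python
import shutil
from pathlib import Path

def copy_to_databox(source_dir, disk_mount):
    """Copy a directory tree onto a mounted Data Box Disk drive.

    Returns the number of files copied. The mount path is a placeholder;
    on Windows the disk typically shows up as a drive letter (e.g. E:\\).
    """
    src, dst = Path(source_dir), Path(disk_mount)
    copied = 0
    for path in src.rglob("*"):
        if path.is_file():
            target = dst / path.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, target)  # copy2 preserves timestamps
            copied += 1
    return copied
```

In practice a tool like robocopy does the same job with retries and multithreading built in (its /MIR flag mirrors a tree, /MT enables multithreaded copies).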


A cool success story

Our customers and partners constantly amaze me with the creative ways they use our products, and Data Box Disk is no exception. For example, it turns out that disks are a great fit for autonomous vehicle research: their small form factor provides the right balance between capacity and portability for collecting test-vehicle data and transporting it into Azure.

One such customer is LG, which is using Data Box Disk at an autonomous vehicle test center in South Korea.

“We needed a way to transfer massive amounts of data for our autonomous vehicle projects, which are based all around the world. The solution needed to be portable, simple to use, cost-effective and, of course, very secure. The Azure Data Box Disk met all of those criteria. Overall it was a very good experience, beginning with Microsoft’s fast response to our business requirement and including its continuing engineering support along the way.” – Hyoyuel (Andy) Kim, Senior Manager, Vehicle Component Company, LG Electronics

Sign up today for previews

Data Box Disk Preview is available in the EU and US, and we’re continuing to expand to other Azure regions in the coming months. The Preview is currently free; look for more information on pricing later this year. Customers and CSP (Cloud Solution Provider) Partners are invited to sign up for the Data Box Disk Preview on the same portal as the Data Box Preview, which is still available at half price, and now also in the EU and UK! ISV Partners, please sign up on our Partner Portal.


Stop by and say hello!

If you’re at Inspire, please stop by our booth; we’d love to show you the Data Box family. If you can’t make it this year, please check out the Azure site or contact your Microsoft rep for more info. Stay tuned, as we’ve got even more cool stuff coming in the months ahead!

Your feedback is important to us. Leave your comments below and let us know what you think about today’s update to the Azure Data Box Family and what we can do to keep improving our service!


New Azure innovation advances customer success for the cloud- and AI-powered future


Organizations around the world are gearing up for a future powered by the intelligent cloud and AI. As these technologies become increasingly central to business strategy and transformation, Microsoft is committed to delivering cutting-edge innovations, programs and expertise that help our customers navigate these technological and business shifts.

At its core, Microsoft is a platform company. That’s why we’re constantly expanding our global reach and adding new innovation to Azure – so all of our customers, everywhere, can succeed. We rely on our vast partner network to help us deliver end-to-end value to organizations of all sizes, all around the world.

I always come back to Microsoft’s mission when preparing for our signature events, and with Microsoft Inspire starting in a few days, I am reminded how powerful it is to work at a company that is rooted in others’ successes. We’ve made it our mission to empower every person and every organization on the planet to achieve more. It’s a bold ambition and one I take seriously. Together with our partners, we’re working to ensure our customers are successful in the midst of rapid technological innovation.

Today, we’re introducing new offers and technologies that will help organizations navigate their own digital transformation and build the critical foundations for cloud and AI. We’re also showcasing momentum from businesses and partners who are investing in intelligent tech from the cloud to the edge.

Expanding the Azure Data Box offering: Last year we introduced Azure Data Box, a secure, easy-to-manage appliance that helps organizations overcome data transfer barriers that can block productivity and slow innovation. The response was overwhelmingly positive, and we’re expanding availability of the Azure Data Box preview to new regions including Europe and the United Kingdom. Organizations can use the Data Box with partner solutions from Commvault, Veeam and others. The Azure Data Box family of products is also expanding to offer our customers even more options to get their data to Azure with the introduction of Azure Data Box Disk. The Data Box Disk is an SSD-based option to move data, no matter where it resides, into Azure with ease. It’s ideal for a recurring or one-time data migration of up to 35 TB and especially well-suited for data transfer from multiple remote branches or offices – and you can sign up for the preview today.

Azure Global Network: Microsoft has built one of the largest cloud networks in the world, providing customers reliable and fast connectivity to our cloud services. We continue to expand our unprecedented network infrastructure and its intelligent capabilities so customers can adapt their networks to a cloud-based model. Aligning with our mission of providing rich networking capabilities with unparalleled security, connectivity and interoperability with partner solutions, today we’re introducing two new services, Azure Virtual WAN (wide-area network) and Azure Firewall, both now available in preview.

Azure Virtual WAN is a networking service providing optimized and automated branch to branch connectivity that makes it easy to connect to and through Azure. Virtual WAN allows customers to seamlessly connect their branches to each other and Azure using last mile Internet enabling a distributed connectivity model. It also enables customers to construct a hub and spoke network in Azure to more easily route traffic to virtual appliances such as firewalls and Azure network security services. Azure Virtual WAN provides mechanisms to connect traditional customer routers on-premises as well as an expanding ecosystem of new Software-Defined WAN (SD-WAN) systems from Microsoft partners.

Azure Firewall is a cloud native network security service that protects your Azure Virtual Network resources. It is a fully stateful firewall as a service with built-in high availability and unrestricted cloud scalability. Customers can centrally create, enforce, and log application and network connectivity policies, spanning Fully Qualified Domain Names (FQDNs), IP Addresses, ports and protocols across subscriptions and virtual networks. Azure Firewall policies can be fully integrated with customers’ dev ops model while enabling them to manage security risks and achieve compliance requirements for managing resource access and protection in the cloud.

Next-generation SQL Data Warehouse: We’re continuing to invest in delivering the best platform for big data and analytics. Starting today, Azure SQL Data Warehouse customers will enjoy at least 2x faster query performance from their workloads, cementing it as the fastest data warehouse in the cloud. This significant improvement is made possible by new instant data movement capabilities that allow extremely efficient movement of data between data warehouse compute nodes. At the heart of every distributed database system is the need to align two or more tables that are partitioned on different keys to produce a final or intermediate result set; the accelerated data movement makes those queries faster. Azure SQL Data Warehouse not only enables lightning-fast analytics on data, but also ensures that the insights are accessible across an organization. We have enhanced the service to support 128 concurrent queries, so more users can query the same database without getting blocked behind other requests. Azure SQL Data Warehouse delivers these query performance and concurrency gains without any price increase, while still offering the ability to elastically scale, as well as pause and resume workloads, across thirty-three regions worldwide.
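To make the table-alignment step concrete, here is a toy Python sketch (not Microsoft’s implementation) of the underlying idea: hash-repartitioning two tables on a join key so that matching rows land on the same compute node and can be joined locally, with no further data movement.

```python
from collections import defaultdict

def repartition(rows, key_index, num_nodes):
    """Hash-distribute rows across compute nodes by a key column, so rows
    that share a key always land on the same node."""
    nodes = defaultdict(list)
    for row in rows:
        nodes[hash(row[key_index]) % num_nodes].append(row)
    return nodes

def distributed_join(left, right, num_nodes=4):
    """Join two tables on column 0 after aligning their partitions.

    Because both sides use the same hash function, each node can build a
    local lookup table and join without talking to other nodes.
    """
    l_parts = repartition(left, 0, num_nodes)
    r_parts = repartition(right, 0, num_nodes)
    result = []
    for n in range(num_nodes):
        lookup = defaultdict(list)
        for row in r_parts.get(n, []):
            lookup[row[0]].append(row)
        for lrow in l_parts.get(n, []):
            for rrow in lookup[lrow[0]]:
                result.append(lrow + rrow[1:])
    return result
```

The "instant data movement" improvement described above speeds up exactly this repartitioning traffic between nodes.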

Power BI expands big data prep, introduces new enterprise capabilities: Power BI is empowering business analysts by expanding self-service prep for big data, and enabling customers to unify modern and enterprise BI on one platform. New preview features beginning to appear in July include the ability to use the familiar Power Query experience to ingest, transform, integrate and enrich big data directly in the Power BI web service. The ingested data can be shared across multiple Power BI models, reports and dashboards. Similarly, Power BI will also support the common data model, which gives organizations the ability to simplify how they enrich their data with other sources from Microsoft and third parties, and accelerate analysis across a broad, unified dataset. In addition, advanced capabilities from SQL Server Analysis Services are coming into Power BI, enabling incremental refresh, higher dataset size limits, and aggregates to allow customers to reach large dataset sizes, while maintaining fast and fluid reporting. Also, SQL Server Reporting Services technology will be part of Power BI, creating a unified, secure, enterprise-wide reporting platform accessible to any user across devices.

Windows Server and SQL Server offer: Going beyond this great product innovation, we’re taking steps to ensure companies can move to the cloud with flexibility. Windows Server and SQL Server 2008/2008 R2 are some of the most popular versions of Microsoft’s legacy products, and End of Support is coming soon. We’re helping our customers navigate this with an option to migrate these workloads to Azure and receive extended critical security updates – at no additional charge. Of course, we understand many of our customers need to keep some workloads on-premises, so they can work with us or our partners to upgrade their on-premises versions as well. We want to ensure our customers stay compliant and secure, while still advancing their innovation with the cloud. This extended support offer presents a huge opportunity for our partners to help customers make the move to Azure or upgrade.

And to make migrating to the cloud even easier for customers, Azure SQL Database Managed Instance will be generally available in early Q4 of this calendar year. Currently in preview, Managed Instance is a new deployment option of Azure SQL Database that provides a fully managed SQL Server instance hosted in the Azure cloud, making it easier to move on-premises SQL Server workloads.

New IoT and edge capabilities and programs to power partner and customer innovation:

This spring, we announced we are investing $5B over the next four years in IoT to enable partner and customer innovation through research and development, new programs and offerings. Today we are continuing the momentum, announcing new capabilities and programs that empower our partners and customers to create connected solutions, from the factory floor to smart buildings and cities. Azure IoT Central, the industry’s first true SaaS IoT solution, is adding support for Power BI and Microsoft Flow – enabling customers to visualize real-time intelligence and create workflows to take action based on these insights. These new capabilities will help customers and partners reduce costly scheduled maintenance by proactively detecting and resolving issues remotely. Azure IoT Central also now supports Cloud Solution Provider (CSP) subscriptions, so CSP partners can easily provision and manage Azure IoT Central applications. We recently announced the general availability of Azure IoT Edge, one of the most enterprise-ready and open edge platforms on the market today. Building on this news, we are introducing two new programs to enrich the Azure IoT Edge ecosystem:

  • Azure Certified for IoT – We are expanding the current Certified for IoT program to certify core edge functionalities (in addition to third-party devices), such as security, device management and edge analytics. You can browse existing devices through the Azure IoT device catalog. Hardware partners can learn more about how to certify devices.
  • Azure Marketplace – Browse pre-built first and third-party IoT Edge software modules now available through Azure Marketplace. Software partners can share and showcase their edge modules on the marketplace today and monetize offerings through the platform in the future.

New improvements for Azure mobile developers: Today’s developers need to build and distribute their iOS and Android apps quickly and ensure new apps reach their users faster and more frequently. Visual Studio App Center now enables mobile developers to distribute apps instantly, visualize nuanced distribution metrics, and mark a release as mandatory so testers update to the latest version. We’ve also redesigned the tester onboarding process to make it easier for developers to ship apps to their users.

Azure Stack and partners provide a premier hybrid solution: Azure Stack lets customers deliver Azure services from an organization’s data center – or even in completely disconnected environments, like an oil rig. It’s a truly partner-centric product. The Azure Stack ecosystem includes a vast network of hardware partners for integrated systems or managed service providers that provide a fully managed Azure Stack experience. Since product launch, there’s been consistent innovation and expanded offerings from the Azure Stack team, including new hardware partners and customer edge solutions. Last week, KPN, a telecommunications company in the Netherlands, announced a blockchain solution with their own customers, all powered by Azure Stack.

We’re well into the era of digital transformation, and we’re excited to have a network of trusted partners to help organizations around the world achieve success – whether they are taking the first steps to migrate to Azure, creating new business value with our IoT and edge solutions, or deriving incredible new insights with our AI technologies. The opportunity to deliver impact across industries, both globally and locally, is here, and Microsoft is committed to being your trusted partner in innovation. As always, we love to hear your feedback or questions.

Latest updates to Azure Database for PostgreSQL


Earlier this year, in March, we announced the general availability (GA) of Azure Database for PostgreSQL, offering the community version of PostgreSQL together with built-in high availability, a 99.99 percent availability SLA, elastic scaling for performance, and industry leading security and compliance in Azure. Since GA, the team has been at work delivering a variety of new features and additional functionality to enhance and extend the value of this service. I am pleased to share with you some details about the new additions to this service based on customer feedback.

Enhanced security with virtual network service endpoints support in preview

The virtual network service endpoints feature for Azure Database for PostgreSQL is now available in Public Preview for all regions in which the service is available.

Use virtual network service endpoints to isolate connectivity to a logical server from only a subnet or a set of subnets within a virtual network. The traffic to Azure Database for PostgreSQL from your virtual network always stays within the Azure backbone network, preferring this direct route over any specific routes that take internet traffic through virtual appliances or on-premises. There's no additional billing for virtual network access through service endpoints. Learn more about virtual network service endpoints and how to configure them.
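Conceptually, a service endpoint acts like a source-subnet allow list on the server: only traffic originating from the approved VNet subnets is accepted. A simplified Python illustration of that check (the real enforcement happens in the Azure network fabric, not in application code; the addresses below are made up):

```python
import ipaddress

def is_allowed(client_ip, allowed_subnets):
    """Return True only if the client's address falls inside one of the
    VNet subnets granted access to the server. Illustrative only."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in ipaddress.ip_network(subnet)
               for subnet in allowed_subnets)

# Example: a server scoped to two subnets of a virtual network.
ALLOWED = ["10.0.1.0/24", "10.0.2.0/24"]
```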

Support for large databases and increased IOPS

Customers can now create 4 TB servers or update existing server storage to 4 TB in Azure Database for PostgreSQL when they select the General Purpose or Memory Optimized service tiers. With the increased storage, customers can also leverage up to 6,000 IOPS.

Disaster recovery across regions made easier with geo redundant backups and recovery

Geo Restore is now generally available. Azure Database for PostgreSQL servers are backed up periodically to enable Restore features. Using this feature, you may restore the server and all its databases to an earlier point-in-time, on a new server, making disaster recovery across regions even easier.

Support for PostgreSQL version 10.3

The latest update to PostgreSQL version 10, version 10.3, is now generally available in Azure Database for PostgreSQL. Features in this version include improved query parallelism and declarative table partitioning. To start using this version, create a new Azure Database for PostgreSQL server, and then select version 10.

Migration support with Azure Database Migration Service (DMS) preview

Customers can now migrate PostgreSQL databases to Azure Database for PostgreSQL with minimal downtime by using the Azure Database Migration Service (DMS). Use the Azure CLI to provision an instance of DMS to perform migrations from PostgreSQL on-premises or on virtual machines to Azure Database for PostgreSQL.

Memory optimized pricing tier generally available

Customers can now create and switch to the new memory optimized pricing tier, which is designed for high-performance database workloads that require in-memory performance for faster transaction processing and higher concurrency.

With boundary scaling now available, customers can also scale across SKUs (General Purpose and Memory Optimized).

Regional availability increased to 28 countries/regions worldwide

General availability of Azure Database for PostgreSQL has been extended to the following countries/regions: Central US (Gen 4), North Central US (Gen 5), France Central (Gen 5), East Asia (Gen 5), Australia East (Gen 5), Australia East 2 (Gen 5), Australia Southeast (Gen 5), Central India (Gen 5), West India (Gen 5), South India (Gen 5), and Korea Central (Gen 5).

Also, in addition to previously available Gen 4 compute, Gen 5 compute is now generally available in the following regions: Brazil South, East US 2, Japan East, Japan West, South Central US, and Southeast Asia.

For a complete list of the 28 available countries/regions, see the article Azure Database for PostgreSQL pricing tiers.

Additional resources for Azure Database for PostgreSQL

If you are currently using this database service, you might find the following links useful:

I hope that you enjoy working with the latest features and functionality available in Azure Database for PostgreSQL. Be sure to share your impressions via User Voice for PostgreSQL or @Ask Azure DB for PostgreSQL.

Announcing public preview of Azure Virtual WAN and Azure Firewall


Networking trends such as SDWAN (Software-Defined Wide Area Network) can improve performance by using path-selection policies at branch offices to send Internet-bound traffic directly to the cloud, eliminating the backhaul to a few select breakout points. This traffic can quickly reach Microsoft’s global backbone network, where intelligent routing provides the best network experience. However, having all branches directly access the Internet introduces new challenges, such as managing branch connectivity and uniformly enforcing network and security policies at scale. Further complicating network policy management across branch offices is the trend of more employees working remotely, along with ever-stricter security, privacy, and compliance requirements that vary by country/region.

Network security plays an important role in protecting users, data and applications. Cloud developers and IT teams struggle to stay ahead of security attacks. Cloud-native network security solutions better fit the modern dev ops model of building and deploying applications, while taking advantage of the economic and scale benefits of the cloud. Customers need turnkey solutions that are easy to deploy, use, and manage, that offer high availability, and that scale automatically.

To help customers with these massive modernization efforts, we are announcing Azure Virtual WAN to simplify large-scale branch connectivity, and Azure Firewall to enforce your network security policies while taking advantage of the scale and simplicity provided by the cloud.


Azure Virtual WAN

The new Azure Virtual WAN service provides optimized, automated and global-scale branch connectivity. Virtual WAN brings the ability to seamlessly connect your branches to Azure with SDWAN and VPN devices (i.e., Customer Premises Equipment, or CPE), with built-in ease of use and automated connectivity and configuration management.


Figure 1: Connect SDWAN and VPN devices to Hubs that comprise an Azure Virtual WAN

Virtual WAN provides a better networking experience by taking advantage of Microsoft’s global network. Traffic from your branches enters Microsoft’s network at the Microsoft edge site closest to a given branch office. We have over 130 edge sites or Points of Presence (PoPs). Once your traffic is in the Microsoft global network, it terminates in a virtual hub. An Azure Virtual WAN is composed of multiple virtual hubs. You can create your hubs in different Azure regions. Azure has more global regions than any other public cloud provider bringing your virtual hubs close to your branches around the world.

Here is a simple example with a virtual hub in West Europe (Netherlands) and another in North Europe (Ireland). These two hubs are part of a customer’s Azure Virtual WAN. Branch offices connect to the closest virtual hub for the very best performance.
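Hub selection is driven by proximity. A toy Python sketch with made-up latency numbers (these are illustrative figures, not measurements) shows the idea of each branch attaching to its lowest-latency virtual hub:

```python
# Hypothetical latency table (ms) from each branch to each virtual hub.
HUB_LATENCY = {
    "paris-office":  {"westeurope": 12, "northeurope": 25},
    "dublin-office": {"westeurope": 22, "northeurope": 8},
}

def closest_hub(branch):
    """Pick the virtual hub with the lowest measured latency for a branch."""
    latencies = HUB_LATENCY[branch]
    return min(latencies, key=latencies.get)
```

In the example above, the Paris branch would attach to the West Europe hub and the Dublin branch to the North Europe hub.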


We are launching the Azure Virtual WAN Preview with Citrix and Riverbed providing a fully automated branch connectivity experience. Our continued commitment to customers is to create more options with a new and fast-growing SDWAN and VPN partner ecosystem. Solutions from additional partners such as Check Point, Nokia Nuage, Palo Alto Networks and Silver Peak will be available in the coming months. I encourage you to join the preview and provide feedback on service functionality and performance, as well as on the ecosystem and partners.


We have been working closely with customers dealing with the challenges of branch connectivity at a global scale. Here is a perspective from Sword Group:

“As a multi-national company, Sword needs to connect all around the world and provide interconnectivity. With Riverbed’s SteelConnect and Microsoft’s Azure Virtual WAN, we are able to deploy locations in minutes easily in a few steps, control the access, and remove complex and expensive scenarios – all while providing cloud workloads close to the users. This capability will allow Sword to engage in new markets with great speed and agility.”

– Guillaume Mottard, COO, Sword Group, Switzerland



Public preview capabilities

  • Virtual WAN and virtual hubs: You can create a virtual WAN and then deploy virtual hubs in any Azure public region. This allows your hubs to be close to your branch offices. The hubs are where network traffic initially terminates before heading to another branch office or an Azure Virtual Network (VNet).
  • Connectivity automation: It is difficult to manually establish and manage a large number of VPN tunnels. Azure Virtual WAN works with your preferred CPE, be it an SD-WAN controller or a VPN device, to automate branch provisioning, configuration management and connectivity setup, enabling you to easily deploy and manage your Virtual WAN.
  • Automated VNet configuration: The automated VNet configuration allows you to easily connect your VNet to your hub so users in a branch office can access their Azure resources. 
  • Troubleshooting and monitoring: The platform monitors your on-premises connections providing a unified experience to manage your Virtual WAN along with your Azure resources.

For details on enrolling in the public preview, please visit the Azure Virtual WAN page.


Azure Firewall

The new Azure Firewall service offers fully stateful native firewall capabilities for Virtual Network resources, with built-in high availability and the ability to scale automatically.  Customers can create and enforce connectivity policies using application and network level filtering rules. Connectivity policies can be enforced across multiple subscriptions and virtual networks. The Azure Firewall service is fully integrated with the Azure platform, portal UI and services.


"As a bank in a regulated context, we need to master all aspects of security in public cloud, including network and application level perimeter security for data exfiltration prevention. Having another solution in Azure completing the existing ones (like NSG, UDR), will allow outgoing flow filtering at L7, significantly increasing our security. Simple to setup, easy to check logs, and an interesting roadmap, Azure Firewall is very promising!”

– Victor Martins, IT Architect, Societe Generale


Public preview capabilities

  • Outbound FQDN filtering: Keep data within your infrastructure and prevent outbound Internet traffic and data exfiltration by limiting outbound HTTP/S traffic to a customer specified list of Fully Qualified Domain Names (FQDN).
  • Network traffic filtering rules: Gain visibility and increase control across multiple subscriptions by centrally creating, enforcing and managing your stateful filtering rules by source and destination address, port and protocol.


  • Outbound SNAT support: Enable outside communication from other security devices and appliances using Source Network Address Translation (SNAT). SNAT support provides address translation between your VNet and Public IP, while easily integrating with existing security perimeter and sharing of policies.
  • Azure Monitor logging: All events are integrated with Azure Monitor, giving you a single shared interface for your logging and analytics needs. The integration ensures logging of all blocked/accepted incidents, and further allows you to archive logs to an Azure storage account, stream events to your Event Hub, or send them to Log Analytics for additional insights.
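To illustrate the two filtering modes described above, here is a toy Python sketch of the decision logic (the policy entries are made up, and a real Azure Firewall evaluates rule collections by priority; this is only a conceptual model):

```python
from fnmatch import fnmatch

# Hypothetical policy: allow outbound HTTP/S only to approved domains,
# plus one network rule permitting DNS to a specific server.
FQDN_ALLOW = ["*.microsoft.com", "pypi.org"]
NETWORK_RULES = [{"dest": "10.0.0.53", "port": 53, "proto": "UDP"}]

def allow_outbound_http(fqdn):
    """FQDN filtering: permit HTTP/S only to allow-listed domain names."""
    return any(fnmatch(fqdn, pattern) for pattern in FQDN_ALLOW)

def allow_network(dest, port, proto):
    """Network rule filtering by destination address, port and protocol."""
    return any(r["dest"] == dest and r["port"] == port and r["proto"] == proto
               for r in NETWORK_RULES)
```

Anything not matched by a rule would be denied and logged, which is the behavior the Azure Monitor integration above surfaces.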



Azure Firewall – A perfect fit with your existing security

Azure Firewall has been built to enhance and strengthen your current Azure security posture, seamlessly complementing existing Azure security services.

  • Network Security Group (NSG) and Azure Firewall are complementary, and together provide defense-in-depth network security. NSGs provide distributed network-layer traffic filtering to limit traffic to resources within virtual networks. Azure Firewall is a fully stateful, centralized network firewall as a service, providing network- and application-level protection across virtual networks.
  • Application Gateway WAF provides centralized inbound protection for web applications (L7). Azure Firewall provides outbound network-level protection (L3-L4) for all ports and protocols, and application-level protection (L7) for outbound HTTP/S.
  • Azure DDoS Protection leverages the scale and elasticity of Microsoft’s global network to bring massive DDoS mitigation capacity in every Azure region. Microsoft’s DDoS Protection service protects your Azure applications by scrubbing traffic at the Azure network edge before it can impact your service's availability.
  • Service Endpoints: For secure access to PaaS services, we recommend service endpoints, which extend your virtual network private address space and the identity of your VNet to the Azure service. Azure Firewall customers can choose to enable service endpoints in the Azure Firewall subnet and disable them on the connected spoke VNets, thereby benefiting from both features: service endpoint security and central logging for all traffic.
  • Network Virtual Appliances: Customers can use a mix of third-party NVAs and Azure Firewall. We are working with our partners on multiple “better together” scenarios.

More details are in the Azure Firewall product page.

With the addition of Virtual WAN and Firewall to our broad portfolio of network services, we are again expanding what is possible with Azure. They both provide a strong testament to our goal of integrating broadly with the platform and your infrastructure, while at the same time being simple and easy to deploy and use.


Summary

Our mission in Azure Networking is to help you build a secure, high-performance, and reliable global network in the cloud for your mission-critical services. We will continue to provide more complete network services. We are interested in your feedback on our new Virtual WAN and Firewall offerings, which you can use today, as well as on our growing catalog of easy-to-use and fully integrated Azure Networking services. Please stay tuned as we prepare for even more announcements in the next few months.

Upcoming webinar and additional resources

I will be hosting a webinar on July 18, 2018 talking about and demoing the new Azure Virtual WAN and Azure Firewall. You will also hear from our partners and customers about these new services.

Latest updates to Azure Database for MySQL


Earlier this year, in March, we announced the general availability (GA) of Azure Database for MySQL, offering the community version of MySQL together with built-in high availability, a 99.99 percent availability SLA, elastic scaling for performance, and industry leading security and compliance in Azure. Since GA, the team has been at work delivering a variety of new features and additional functionality to enhance and extend the value of this service. I am pleased to share with you some details about the new additions to this service based on customer feedback.

Enhanced security with virtual network service endpoints support in preview

The virtual network service endpoints feature for Azure Database for MySQL is now available in Public Preview for all regions in which the service is available.

Use virtual network service endpoints to isolate connectivity to a logical server from only a subnet or a set of subnets within a virtual network. The traffic to Azure Database for MySQL from your virtual network always stays within the Azure backbone network, preferring this direct route over any specific routes that take internet traffic through virtual appliances or on-premises. There's no additional billing for virtual network access through service endpoints. Learn more about virtual network service endpoints and how to configure them.

Support for large databases

Customers can now create 4 TB servers or scale existing server storage up to 4 TB in Azure Database for MySQL when they select the General Purpose or Memory Optimized service tiers. With increased storage, customers can also leverage up to 6,000 IOPS.

Data-in Replication generally available

Azure Database for MySQL now supports Data-in Replication. Use this feature to synchronize data from a MySQL server running on-premises, in virtual machines, or database services outside Azure into Azure Database for MySQL. Data-in Replication was designed for scenarios such as:

  • Hybrid data synchronization: Sync data between on-premises servers and Azure Database for MySQL.
  • Multi-cloud synchronization: Sync data between different cloud providers (including VMs and other database services) and Azure Database for MySQL.

Learn more about data-in replication and how to configure it.

Disaster recovery across regions made easier with geo redundant backups and recovery

Geo Restore is now generally available. Azure Database for MySQL servers are backed up periodically to enable Restore features. Using this feature, you may restore the server and all its databases to an earlier point-in-time, on a new server, making disaster recovery across regions even easier.

Migration support with Azure Database Migration Service (DMS) preview

Customers can now migrate MySQL databases to Azure Database for MySQL with minimal downtime by using the Azure Database Migration Service (DMS). Use the Azure CLI to provision an instance of DMS to perform migrations from MySQL on-premises or on virtual machines to Azure Database for MySQL.

Memory optimized pricing tier generally available

Customers can now create and switch to the new memory optimized pricing tier, which is designed for high-performance database workloads that require in-memory performance for faster transaction processing and higher concurrency.

With scaling across tier boundaries now available, customers can also move between SKUs (General Purpose and Memory Optimized).

Regional availability increased to 28 countries/regions worldwide

General availability of Azure Database for MySQL has been extended to the following countries/regions: Central US (Gen 4), North Central US (Gen 5), France Central (Gen 5), East Asia (Gen 5), Australia East (Gen 5), Australia East 2 (Gen 5), Australia Southeast (Gen 5), Central India (Gen 5), West India (Gen 5), South India (Gen 5), and Korea Central (Gen 5).

Also, in addition to previously available Gen 4 compute, Gen 5 compute is now generally available in the following regions: Brazil South, East US 2, Japan East, Japan West, South Central US, and Southeast Asia.

For a complete list of the 30 available regions, see the article Azure Database for MySQL pricing tiers.

Additional resources for Azure Database for MySQL

If you are currently using this database service, you might find the following links useful:

I hope that you enjoy working with the latest features and functionality available in our Azure Database for MySQL. Be sure to share your impressions via User Voice for MySQL or @Ask Azure DB for MySQL.

Azure NetApp files now in public preview


Enterprises are rapidly embracing public cloud to take advantage of the scale, agility and economic benefits. However, deploying file-based applications to the cloud has been challenging due to the limited availability of options for highly performant and scalable cloud file services that meet enterprise-grade requirements.
 
Microsoft has teamed up with NetApp to enable enterprises to overcome the challenge of deploying file-based workloads to the cloud. Today we're pleased to announce the public preview of Azure NetApp Files, an Azure native service powered by NetApp's leading ONTAP technology, which has been long trusted by enterprises to meet their requirements for performance, scalability, data management, security, and hybrid capability.

Over the next few months of the preview, you can expect:

  • Rich data management: Azure NetApp Files will support multiple protocols, protocol versions, and performance tiers, as well as offer built-in policy-based snapshot functionality, reducing the need for separate data protection solutions.
  • Security: All data in volumes is automatically encrypted at rest and will benefit from Azure’s industry-leading compliance portfolio.
  • Ease of use: Azure NetApp Files is a fully integrated and managed Azure service, accessible through the Azure portal and manageable through Azure SDKs and command-line tools. It eliminates the provisioning and management overhead that is required for on-premises storage and empowers users without storage expertise to leverage NetApp’s powerful ONTAP technology.

Getting started

To join the preview waitlist, visit the Azure NetApp Files Public Preview signup page. From the preview invitation, you can find a link to the preview portal to access the service (Figure 1) and within a few clicks, you'll be able to create your first Azure NetApp Files NFS v3 volume (Figure 2)!

image Figure 1: Azure NetApp Files available in Azure portal

image
Figure 2: An NFS v3 volume listed in an Azure NetApp account.

Azure NetApp Files is available in East US and will soon be offered in West US 2. As with other previews, the service should not be used for production workloads until it reaches general availability (GA).

Pricing

During preview, Azure NetApp Files will be offered at a reduced price. Preview pricing information will be provided in the preview invitation.

Get it, use it and tell us about it

We look forward to hearing your feedback on the service! You can email us at ANFFeedback@microsoft.com. Lastly, we love to hear all of your ideas and suggestions about Azure Storage, which you can post at the Azure Storage feedback forum.

CAMS Software Reduces Costs and Increases Efficiency By Using the Bing Maps Distance Matrix API


Based in British Columbia, CAMS Software is a small transportation management software company with big customers. The company’s flagship transportation management solution, Prospero, is used by the majority of major grocery chains in North America, including industry giants.

CAMS Software switched from its old maps provider to Bing Maps to simplify customer interactions, reduce operating costs, and gain efficiency and a competitive edge. By enhancing Prospero with Bing Maps, CAMS Software improved its customers’ experience, giving them faster service, more accurate data, straightforward licensing, and lower costs.

Read the full story on the Microsoft Customer Stories website.

Announcing TypeScript 3.0 RC

TypeScript 3.0, our next release of the type system, compiler, and language service, is fast-approaching! Today we’re excited to announce the Release Candidate of TypeScript 3.0! We’re looking to get any and all feedback from this RC to successfully ship TypeScript 3.0 proper, so if you’d like to give it a shot now, you can get the RC through NuGet, or use npm with the following command:
npm install -g typescript@rc

You can also get editor support for the RC in Visual Studio 2017 and Visual Studio Code.

While Visual Studio 2015 doesn’t currently have an RC installer, TypeScript 3.0 will be available for Visual Studio 2015 users.

While you can read about everything on our Roadmap, we’ll be discussing a few major items going into TypeScript 3.0.

Without further ado, let’s jump into some highlights of the Release Candidate!

Project references

It’s fairly common to have several different build steps for a library or application. Maybe your codebase has a src and a test directory. Maybe you have your front-end code in a folder called client, your Node.js back-end code in a folder called server, each of which imports code from a shared folder. And maybe you use what’s called a “monorepo” and have many, many projects which depend on each other in non-trivial ways.

One of the biggest features that we’ve worked on for TypeScript 3.0 is called “project references”, and it aims to make working with these scenarios easier.

Project references allow TypeScript projects to depend on other TypeScript projects – specifically, allowing tsconfig.json files to reference other tsconfig.json files. Specifying these dependencies makes it easier to split your code into smaller projects, since it gives TypeScript (and tools around it) a way to understand build ordering and output structure. That means things like faster builds that work incrementally, and support for transparently navigating, editing, and refactoring across projects. Since 3.0 lays the foundation and exposes the APIs, any build tool should be able to provide this.

What’s it look like?

As a quick example, here’s what a tsconfig.json with project references looks like:

// ./src/bar/tsconfig.json
{
    "compilerOptions": {
        // Needed for project references.
        "composite": true,
        "declaration": true,

        // Other options...
        "outDir": "../../lib/bar",
        "strict": true,
        "module": "esnext",
        "moduleResolution": "node"
    },
    "references": [
        { "path": "../foo" }
    ]
}

There are two new fields to notice here: composite and references.

references simply specifies other tsconfig.json files (or folders immediately containing them). Each reference is currently just an object with a path field, and lets TypeScript know that building the current project requires building that referenced project first.

Perhaps equally important is the composite field. The composite field ensures certain options are enabled so that this project can be referenced and built incrementally for any project that depends on it. Being able to intelligently and incrementally rebuild is important, since it’s part of what gives project references a leg up on tools like make. For example, if project front-end depends on shared, and shared depends on core, our APIs around project references can be used to detect a change in core, but to only rebuild shared if the types (i.e. the .d.ts files) produced by core have changed. That means a change to core doesn’t completely force us to rebuild the world. For that reason, setting composite forces the declaration flag to be set as well.
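To complete the picture, the referenced foo project needs its own tsconfig.json with composite enabled (a sketch; the outDir and other option values here are assumptions for illustration, not taken from a real project):

```json
// ./src/foo/tsconfig.json
{
    "compilerOptions": {
        // Required so other projects can reference this one.
        "composite": true,
        // Implied by "composite"; emits the .d.ts files that
        // downstream projects type-check against.
        "declaration": true,

        "outDir": "../../lib/foo",
        "strict": true
    }
}
```

With both files in place, building bar will first build foo if its output is missing or out of date.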

--build mode

TypeScript 3.0 will provide a set of APIs for project references so that other tools can provide this fast incremental behavior. As an example, gulp-typescript already does support it! So project references should be able to integrate with your choice of build orchestrators in the future.

However, for many simple apps and libraries, it’s nice not to need external tools. That’s why tsc now ships with a new --build flag.

tsc --build (or its nickname, tsc -b) takes a set of projects and builds them and their dependencies. When using this new build mode, the --build flag has to be set first, and can be paired with certain other flags:

  • --verbose: displays every step of what a build requires
  • --dry: performs a build without emitting files (this is useful with --verbose)
  • --clean: attempts to remove output files given the inputs
  • --force: forces a full non-incremental rebuild for a project

Controlling output structure

One subtle but incredibly useful benefit of project references is logically being able to map your input source to its outputs.

If you’ve ever tried to share TypeScript code between the client and server of your application, you might have run into problems controlling the output structure.

For example, if client/index.ts and server/index.ts both reference shared/index.ts for the following projects:

src
├── client
│   ├── index.ts
│   └── tsconfig.json
├── server
│   ├── index.ts
│   └── tsconfig.json
└── shared
    └── index.ts

…then trying to build client and server, we’ll end up with…

lib
├── client
│   ├── client
│   │   └── index.js
│   └── shared
│       └── index.js
└── server
    ├── server
    │   └── index.js
    └── shared
        └── index.js

rather than

lib
├── client
│   └── index.js
├── shared
│   └── index.js
└── server
    └── index.js

Notice that we ended up with a copy of shared in both client and server. We unnecessarily spent time building shared and introduced an undesirable level of nesting in lib/client/client and lib/server/server.

The problem is that TypeScript greedily looks for .ts files and tries to include them in a given compilation. Ideally, TypeScript would understand that these files don’t need to be built in the same compilation, and would instead jump to the .d.ts files for type information.

Creating a tsconfig.json for shared and using project references does exactly that. It signals to TypeScript that

  1. shared should be built independently, and that
  2. when importing from ../shared, we should look for the .d.ts files in its output directory.

This avoids triggering a double-build, and also avoids accidentally absorbing all the contents of shared.
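Concretely, that means giving shared its own project file. A minimal sketch (the option values here are assumptions based on the layout above) might look like:

```json
// ./src/shared/tsconfig.json
{
    "compilerOptions": {
        "composite": true,
        "declaration": true,
        "outDir": "../../lib/shared"
    }
}
```

client and server would then each add `"references": [{ "path": "../shared" }]` to their own tsconfig.json files, so that imports from ../shared resolve to the .d.ts files under lib/shared instead of recompiling the sources.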

Further work

To get a deeper understanding of project references and how you can use them, read more on our issue tracker. In the near future, we’ll have documentation on project references and build mode.

We’re committed to ensuring that other tool authors can support project references, and to continuing to improve the editor experience around them.

Extracting and spreading parameter lists with tuples

We often take it for granted, but JavaScript lets us think about parameter lists as first-class values – either by using arguments or rest-parameters (e.g. ...rest).

function call(fn, ...args) {
    return fn(...args);
}

Notice here that call works on functions of any parameter length. Unlike other languages, we don’t need to define a call1, call2, call3 as follows:

function call1(fn, param1) {
    return fn(param1);
}

function call2(fn, param1, param2) {
    return fn(param1, param2);
}

function call3(fn, param1, param2, param3) {
    return fn(param1, param2, param3);
}

Unfortunately, for a while there wasn’t a great well-typed way to express this statically in TypeScript without declaring a finite number of overloads:

// TODO (billg): 4 overloads should *probably* be enough for anybody?
function call<T1, T2, T3, T4, R>(fn: (param1: T1, param2: T2, param3: T3, param4: T4) => R, param1: T1, param2: T2, param3: T3, param4: T4): R;
function call<T1, T2, T3, R>(fn: (param1: T1, param2: T2, param3: T3) => R, param1: T1, param2: T2, param3: T3): R;
function call<T1, T2, R>(fn: (param1: T1, param2: T2) => R, param1: T1, param2: T2): R;
function call<T1, R>(fn: (param1: T1) => R, param1: T1): R;
function call(fn: (...args: any[]) => any, ...args: any[]) {
    return fn(...args);
}

Oof! Another case of death by a thousand overloads! Or at least, as many overloads as our users asked us for.

TypeScript 3.0 allows us to better model scenarios like these by now allowing rest parameters to be generic, and inferring those generics as tuple types! Instead of declaring each of these overloads, we can say that the ...args rest parameter from fn must be a type parameter that extends an array, and then we can re-use that for the ...args that call passes:

function call<TS extends any[], R>(fn: (...args: TS) => R, ...args: TS): R {
    return fn(...args);
}

When we call the call function, TypeScript will try to extract the parameter list from whatever we pass to fn, and turn that into a tuple:

function foo(x: number, y: string): string {
    return (x + y).toLowerCase();
}

// `TS` is inferred as `[number, string]`
call(foo, 100, "hello");

When TypeScript infers TS as [number, string] and we end up re-using TS on the rest parameter of call, the instantiation looks like the following

function call(fn: (...args: [number, string]) => string, ...args: [number, string]): string

And with TypeScript 3.0, using a tuple in a rest parameter gets flattened into the rest of the parameter list! The above boils down to simple parameters with no tuples:

function call(fn: (arg1: number, arg2: string) => string, arg1: number, arg2: string): string

So in addition to catching type errors when we pass in the wrong arguments:

function call<TS extends any[], R>(fn: (...args: TS) => R, ...args: TS): R {
    return fn(...args);
}

call((x: number, y: string) => y, "hello", "world");
//                                ~~~~~~~
// Error! `string` isn't assignable to `number`!

and inference from other arguments:

call((x, y) => { /* .... */ }, "hello", 100);
//    ^  ^
// `x` and `y` have their types inferred as `string` and `number` respectively.

we can also observe the tuple types that these functions infer from the outside:

function tuple<TS extends any[]>(...xs: TS): TS {
    return xs;
}

let x = tuple(1, 2, "hello"); // has type `[number, number, string]`

There is a subtler point to note though. In order to make all of this work, we needed to expand what tuples could do…

Richer tuple types

Parameter lists aren’t just ordered lists of types. Parameters at the end can be optional:

// Both `y` and `z` are optional here.
function foo(x: boolean, y = 100, z?: string) {
    // ...
}

foo(true);
foo(true, undefined, "hello");
foo(true, 200);

And the final parameter can be a rest parameter.

// `rest` accepts any number of strings - even none!
function foo(...rest: string[]) {
    // ...
}

foo();
foo("hello");
foo("hello", "world");

Finally, there is one mildly interesting property about parameter lists which is that they can be empty:

// Accepts no parameters.
function foo() {
    // ...
}

foo()

So to make it possible for tuples to correspond to parameter lists, we needed to model each of these scenarios.

First, tuples now allow trailing optional elements:

/**
 * 2D, or potentially 3D, coordinate.
 */
type Coordinate = [number, number, number?];

The Coordinate type creates a tuple whose element at index 2 is optional – it might not be defined! Interestingly, since tuples use numeric literal types for their length properties, Coordinate‘s length property has the type 2 | 3.
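For instance, both of the following assignments are valid, and the tuple’s length type reflects whether the optional element was provided:

```typescript
type Coordinate = [number, number, number?];

// A 2D point can omit the optional third element...
const flat: Coordinate = [10, 20];

// ...while a 3D point supplies it.
const spatial: Coordinate = [10, 20, 30];

// `length` is typed as the union `2 | 3` for both.
const flatLength: 2 | 3 = flat.length;
const spatialLength: 2 | 3 = spatial.length;
```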

Second, tuples now allow rest elements at the end.

type OneNumberAndSomeStrings = [number, ...string[]];

Rest elements introduce some interesting open-ended behavior to tuples. The above OneNumberAndSomeStrings type requires its first property to be a number, and permits 0 or more strings. Indexing with an arbitrary number will return a string | number since the index won’t be known. Likewise, since the tuple length won’t be known, the length property is just number.
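For example, under the OneNumberAndSomeStrings type above:

```typescript
type OneNumberAndSomeStrings = [number, ...string[]];

// Zero trailing strings is fine...
const justOne: OneNumberAndSomeStrings = [1];

// ...and so is any number of them.
const several: OneNumberAndSomeStrings = [1, "hello", "world"];

// Indexing with an arbitrary number yields `string | number`,
// since neither the index nor the length is statically known.
const item: string | number = several[2];
```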

Of note, when no other elements are present, a rest element in a tuple is identical to the plain array type it spreads:

type Foo = [...number[]]; // Equivalent to `number[]`.

Finally, tuples can now be empty! While it’s not that useful outside of parameter lists, the empty tuple type can be referenced as []:

type EmptyTuple = [];

As you might expect, the empty tuple has a length of 0 and indexing with a number returns the never type.

The unknown type

The any type is the most-capable type in TypeScript – while it encompasses the type of every possible value, it doesn’t force us to do any checking before we try to call, construct, or access properties on these values. It also lets us assign values of type any to values that expect any other type.

This is mostly useful, but it can be a bit lax.

let foo: any = 10;

// All of these will throw errors, but TypeScript
// won't complain since `foo` has the type `any`.
foo.x.prop;
foo.y.prop;
foo.z.prop;
foo();
new foo();
upperCase(foo);
foo `hello world!`;

function upperCase(x: string) {
    return x.toUpperCase();
}

There are often times where we want to describe the least-capable type in TypeScript. This is useful for APIs that want to signal “this can be any value, so you must perform some type of checking before you use it”. This forces users to safely introspect returned values.

TypeScript 3.0 introduces a new type called unknown that does exactly that. Much like any, any value is assignable to unknown; however, unlike any, you cannot access any properties on values with the type unknown, nor can you call/construct them. Furthermore, values of type unknown can only be assigned to unknown or any.

As an example, swapping the above example to use unknown instead of any turns all usages of foo into errors:

let foo: unknown = 10;

// Since `foo` has type `unknown`, TypeScript
// errors on each of these usages.
foo.x.prop;
foo.y.prop;
foo.z.prop;
foo();
new foo();
upperCase(foo);
foo `hello world!`;

function upperCase(x: string) {
    return x.toUpperCase();
}

Instead, we’re now forced to either perform checking, or use a type assertion to convince the type-system that we know better.

let foo: unknown = 10;

function hasXYZ(obj: any): obj is { x: any, y: any, z: any } {
    return !!obj &&
        typeof obj === "object" &&
        "x" in obj &&
        "y" in obj &&
        "z" in obj;
}

// Using a user-defined type guard...
if (hasXYZ(foo)) {
    // ...we're allowed to access certain properties again.
    foo.x.prop;
    foo.y.prop;
    foo.z.prop;
}

// We can also just convince TypeScript we know what we're doing
// by using a type assertion.
upperCase(foo as string);

function upperCase(x: string) {
    return x.toUpperCase();
}

Support for defaultProps in JSX

Default initializers are a handy feature in modern TypeScript/JavaScript. They give us a useful syntax to let callers use functions more easily by not requiring certain arguments, while letting function authors ensure that their values are always defined in a clean way.

function loudlyGreet(name = "world") {
    // Thanks to the default initializer, `name` will always have type `string` internally.
    // We don't have to check for `undefined` here.
    console.log("HELLO", name.toUpperCase());
}

// Externally, `name` is optional, and we can potentially pass `undefined` or omit it entirely.
loudlyGreet();
loudlyGreet(undefined);

In React, a similar concept exists for components and their props. When creating a new element, React looks up a property called defaultProps to fill in any props that were omitted.

// Some non-TypeScript JSX file

import * as React from "react";
import * as ReactDOM from "react-dom";

export class Greet extends React.Component {
    render() {
        const { name } = this.props;
        return <div>Hello {name.toUpperCase()}!</div>;
    }

    static defaultProps = {
        name: "world",
    };
}

//      Notice no `name` attribute was specified!
//                                     vvvvvvvvv
const result = ReactDOM.renderToString(<Greet />);
console.log(result);

Notice that in <Greet />, name didn’t have to be specified. When a Greet element is created, name will be initialized with "world" and this code will print <div>Hello world!</div>.

Unfortunately, TypeScript didn’t understand that defaultProps had any bearing on JSX invocations. Instead, users would often have to declare properties optional and use non-null assertions inside of render:

export interface Props { name?: string }
export class Greet extends React.Component<Props> {
    render() {
        const { name } = this.props;

        // Notice the `!` ------v
        return <div>Hello {name!.toUpperCase()}!</div>;
    }
    static defaultProps = { name: "world"}
}

Or they’d use some hacky type-assertions to fix up the type of the component before exporting it.

That’s why in TypeScript 3.0, the language supports a new type alias in the JSX namespace called LibraryManagedAttributes. Despite the long name, this is just a helper type that tells TypeScript what attributes a JSX tag accepts. The short story is that using this general type, we can model React’s specific behavior for things like defaultProps and, to some extent, propTypes.

export interface Props {
    name: string
}

export class Greet extends React.Component<Props> {
    render() {
        const { name } = this.props;
        return <div>Hello {name.toUpperCase()}!</div>;
    }
    static defaultProps = { name: "world"}
}

// Type-checks! No type assertions needed!
let el = <Greet />

Keep in mind that there are some limitations. defaultProps that explicitly specify their type as something like Partial<Props>, or stateless function components (SFCs) whose defaultProps are declared with Partial<Props>, will make all props optional. As a workaround, you can omit the type annotation entirely for defaultProps on a class component (like we did above), or use ES2015 default initializers for SFCs:

function Greet({ name = "world" }: Props) {
    return <div>Hello {name.toUpperCase()}!</div>;
}

Breaking changes

You can always keep an eye on upcoming breaking changes in the language as well as in our API.

We expect TypeScript 3.0 to have very few impactful breaking changes. Language changes should be minimally disruptive, and most breaks in our APIs are oriented around removing already-deprecated functions.

unknown is a reserved type name

Since unknown is a new built-in type, it can no longer be used as the name of user-declared types like interfaces, type aliases, or classes.

API breaking changes

  • The deprecated internal method LanguageService#getSourceFile has been removed, as it has been deprecated for two years. See #24540.
  • The deprecated function TypeChecker#getSymbolDisplayBuilder and associated interfaces have been removed. See #25331. The emitter and node builder should be used instead.
  • The deprecated functions escapeIdentifier and unescapeIdentifier have been removed. Due to changing how the identifier name API worked in general, they have been identity functions for a few releases, so if you need your code to behave the same way, simply removing the calls should be sufficient. Alternatively, the typesafe escapeLeadingUnderscores and unescapeLeadingUnderscores should be used if the types indicate they are required (as they are used to convert to or from branded __String and string types).
  • The TypeChecker#getSuggestionForNonexistentProperty, TypeChecker#getSuggestionForNonexistentSymbol, and TypeChecker#getSuggestionForNonexistentModule methods have been made internal, and are no longer part of our public API. See #25520.

What’s next?

Since we always keep our Roadmap up-to-date, you can get a picture of what else is in store for TypeScript 3.0 and beyond there.

TypeScript 3.0 is meant to be a foundational release for project references and other powerful type system constructs. We expect it to be available at the end of the month, so until then we’ll be polishing off the release so that you can have the smoothest experience possible. In the meantime, we would love and appreciate any feedback from this RC so we can bring you that experience. Everything we do in TypeScript is driven by our users’ needs, so any contribution you can make by just trying out this release candidate would go a long way.

So feel free to drop us a line on GitHub if you run into any problems, and let others know how you feel about this RC on Twitter and in the comments below!

Happy hacking!


Introducing Workplace Analytics solutions and MyAnalytics nudges


Collaboration habits can make or break teamwork. When people run efficient meetings, create time for focused work, and respect work/life boundaries, their teams thrive. Putting these habits in place is difficult and takes the support of the entire team. Data can create a common language to help members build consensus on important teamwork norms. By shedding light on how work actually gets done, organizations can build more efficient, creative, and engaged teams.

Today, we are announcing two new features—Workplace Analytics solutions and MyAnalytics nudges—designed to put individuals and teams at the center of change.

Workplace Analytics solutions—empower teams to master their time

Workplace Analytics uses data from everyday work in Office 365 to identify collaboration patterns that impact productivity, workforce effectiveness, and employee engagement.

Today, we are introducing solutions to help turn organization-wide insights into action plans for individuals and teams. The first solution—Workplace Analytics solution for teamwork—helps teams build better collaboration habits and master their time by guiding organizations through three steps:

Discover collaboration challenges—Use data from everyday work in Office 365, like emails and meetings, to discover challenges like meeting overload, minimal time for focused work, or high after-hours workload. Combine these insights with engagement survey results to find connections between work patterns and indicators of team health like engagement and innovation scores.

Analysis shows that the marketing team spends far more time in meetings than other teams and could benefit from being enrolled in a change program to help reduce meeting hours.

Empower teams to change—Enroll teams in change programs to help them build better habits like bringing agendas to meetings and blocking time for daily focused work. Participants receive personal productivity insights and action plans powered by MyAnalytics.

When teams are enrolled in a change program, members get access to an action plan in MyAnalytics that shows progress towards meeting team goals.

Measure and improve—Make sure your change programs are successful by measuring progress against goals over time. Iterate and improve as you see which action plans succeed or fail in changing teamwork habits.

This team is four weeks into their change program and has already decreased average weekly meeting hours by 11 percent.

The Workplace Analytics solution for teamwork—accessed via the Solutions tab—is now available in preview for customers using both Workplace Analytics and MyAnalytics. Learn more about the Workplace Analytics solution for teamwork.

MyAnalytics nudges—work smarter with data-driven collaboration tips in Outlook

Building better teams starts with transparent, data-driven dialog—but no one is perfect and sticking to good collaboration habits can be challenging in a fast-paced job. Nudges in MyAnalytics can help close the gap by providing friendly, data-driven collaboration tips that surface as you get work done in Office 365.

User sees a notification on a meeting invite that nudges them to set aside time for focused work.

Starting this summer, four types of MyAnalytics nudges will start to surface in Outlook as you read and compose emails and meeting invites.

Get more focus time—Challenging, innovative work requires deep focus and undivided attention. As your calendar fills up with meetings, MyAnalytics will remind you to set aside time for focused work before accepting new invites. You can see available times and block them off without leaving your inbox. MyAnalytics will also notify you when a meeting invite conflicts with a block of focus time that you’ve already scheduled.

Run more effective meetings—Meetings are necessary to get work done, but they often take up the entire week, leaving little time for other work. As you create and receive meeting invites, MyAnalytics will nudge you to put good meeting habits into practice and save precious hours. For example, if your schedule is already busy, MyAnalytics will nudge you to ask a coworker to cover a meeting for you.

Reduce your after-hours impact on coworkers—We announced last year that MyAnalytics summarizes your after-hours impact on coworkers. Now, MyAnalytics will actively nudge you to avoid sending after-hours emails as you draft them to coworkers that you’ve recently impacted outside of regular working hours.

Stay on top of to-do’s and unread email—MyAnalytics already uses AI to remind you of tasks you promised over email to complete for coworkers (and tasks they asked you to get done). Now, MyAnalytics surfaces these reminders as you read emails from coworkers, so you can close out important tasks before taking on new ones. MyAnalytics will also remind you of unread email for your important contacts.

Starting this summer, MyAnalytics users will see nudges on the latest version of Outlook on the web. Users can turn off nudging using the MyAnalytics add-in for Outlook. Read our support article to learn more.

MyAnalytics comes with Office 365 Enterprise E5 and is available as an add-on to other Office 365 enterprise plans. Learn more about MyAnalytics.

The post Introducing Workplace Analytics solutions and MyAnalytics nudges appeared first on Microsoft 365 Blog.

4 new ways Microsoft 365 takes the work out of teamwork—including free version of Microsoft Teams


It’s been one year since we introduced Microsoft 365, a holistic workplace solution that empowers everyone to work together in a secure way. In that time, Microsoft 365 seats have grown by more than 100 percent, building on the more than 135 million commercial monthly Office 365 users, 200 million Windows 10 commercial devices in use, and over 65 million seats of Enterprise Mobility + Security.

This momentum is driven by customers—in every industry—who are transforming their organizations to enable high performance from a workforce that is more diverse, distributed, and mobile than ever before. Microsoft 365 is designed to empower every type of worker—whether on the first lines of a business, managing a small team, or leading an entire organization.

Today, we are introducing four new ways Microsoft 365 connects people across their organization and improves collaboration habits, including extending the power of Microsoft Teams and new AI-infused capabilities in Microsoft 365.

1—Try Microsoft Teams, now available in a free version

To address the growing collaboration needs of our customers, last year we introduced Microsoft Teams, a powerful hub for teamwork that brings together chat, meetings, calling, files, and apps into a shared workspace in Microsoft 365. Now, more than 200,000 businesses across 181 markets use Teams to collaborate and get work done.

Beginning today, Teams is available in a free version worldwide in 40 languages. Whether you’re a freelancer, a small business owner, or part of a team inside a large organization, you can start using Teams today.

The free version includes the following for up to 300 people:

  • Unlimited chat messages and search.
  • Built-in audio and video calling for individuals, groups, and full team meetups.
  • 10 GB of team file storage plus an additional 2 GB per person for personal storage.
  • Integrated, real-time content creation with Office Online apps, including built-in Word, Excel, PowerPoint, and OneNote.
  • Unlimited app integrations with 140+ business apps to choose from—including Adobe, Evernote, and Trello.
  • Ability to communicate and collaborate with anyone inside or outside your organization, backed by Microsoft’s secure, global infrastructure.

This new offering provides a powerful introduction to Microsoft 365. Teams in Microsoft 365 includes everything in the free version plus additional storage, enterprise security, and compliance, and it can be used for your whole organization, regardless of size.

As we advance our mission to empower every person and organization on the planet to achieve more, what’s most exciting are the stories of customers taking on big projects with a small workforce, such as The Hustle Media Company—who helps movers, shakers, and doers make their dent in the world. Their popular daily email provides their audience with the tech and business news they need to know.

“As a media company that nearly quadrupled in size over the last year, it became apparent we needed a solution to connect all of The Hustle’s offices. As previous Slack users, we found that Microsoft Teams has all the features that other chat-based apps bring, but the teamwork hub allows everything to live in one place.”
—Adam Ryan, vice president of Media at The Hustle

Or, take it from Urban Agriculture Company, a small business specializing in organic, easy-to-use grow kits of vegetables, flowers, and herbs. After landing on Oprah’s favorite things, founder Chad Corzine turned to Microsoft 365 Business and Teams to manage communication among his rapidly growing departments, onboard employees, and protect customer data.

2—Use new intelligent event capabilities in Microsoft 365

Today, we’re also introducing new capabilities that allow anyone in your organization to create live and on-demand events in Microsoft 365. Events can be viewed in real-time or on-demand, with high-definition video and interactive discussion.

AI-powered services enhance the on-demand experience with:

  • A speaker timeline, which uses facial detection to identify who is talking, so you can easily jump to a particular speaker in the event.
  • Speech-to-text transcription, timecoding, and transcript search, so you can quickly find moments that matter in a recording.
  • Closed captions to make the event more accessible to all.
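
As a toy illustration of what search over a timecoded transcript looks like, here is a minimal Python sketch. The segment format and function name are assumptions for illustration, not the actual Microsoft 365 event APIs:

```python
# Toy sketch: finding moments in a speech-to-text transcript by keyword.
# The segment layout and helper name are illustrative only.

def search_transcript(segments, query):
    """Return (start_time, text) for each segment containing the query."""
    q = query.lower()
    return [(s["start"], s["text"]) for s in segments if q in s["text"].lower()]

segments = [
    {"start": "00:00:05", "speaker": "A", "text": "Welcome to the quarterly review."},
    {"start": "00:03:42", "speaker": "B", "text": "Revenue grew in the cloud segment."},
    {"start": "00:07:10", "speaker": "A", "text": "Questions about cloud margins?"},
]

for start, text in search_transcript(segments, "cloud"):
    print(start, text)
```

Jumping to a particular speaker works the same way, filtering on the `speaker` field instead of the text.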

Events can be as simple or as sophisticated as you prefer. You can use webcams, content, and screen sharing for informal presentations, or stream a studio-quality production for more formal events.

3—Leverage analytics to build better collaboration habits

We’re rolling out the preview of a new Workplace Analytics solution, which uses collaboration insights from the Microsoft Graph, to help teams run efficient meetings, create time for focused work, and respect work/life boundaries. Organizations can use aggregate data in Workplace Analytics to identify opportunities for improving collaboration, then share insights and suggest habits to specific teams using MyAnalytics.

We’re also rolling out nudges, powered by MyAnalytics in Microsoft 365, which deliver habit-changing tips in Outlook, such as flagging that you’re emailing coworkers after hours or suggesting you book focused work time for yourself.

4—Work with others on a shared digital canvas with Microsoft Whiteboard

Microsoft Whiteboard is now generally available for Windows 10, coming soon to iOS, and available in preview on the web. Whether meeting in person or virtually, people need the ability to collaborate in real-time. The new Whiteboard application enables people to ideate, iterate, and work together both in person and remotely, across multiple devices. Using pen, touch, and keyboard, you can jot down notes, create tables, shapes, and freeform drawings, and search for and insert images from the web.

Get started

Whether you’re managing a new project or creating your own business, it helps to have your team behind you to brainstorm ideas, tackle the work together, and have some fun along the way. Take your teamwork to the next level and start using Teams today.

Learn more about how Microsoft 365 enables teamwork below:

The post 4 new ways Microsoft 365 takes the work out of teamwork—including free version of Microsoft Teams appeared first on Microsoft 365 Blog.

Azure DevOps Project general availability

During our Connect(); 2017 event, we announced the public preview of Azure DevOps Projects to help customers start running applications on any Azure service in just three steps. Today, we’re excited to announce that Azure DevOps Projects is now generally available in the Azure Portal, making it easier for developers to deploy to the Azure cloud and create CI/CD pipelines... Read More

New open data sets from Microsoft Research


Microsoft has released a number of data sets produced by Microsoft Research and made them available for download at Microsoft Research Open Data.  

The datasets in Microsoft Research Open Data are categorized by their primary research area, such as Physics, Social Science, Environmental Science, and Information Science. Many of the datasets were not previously available to the public, and many are large and useful for research in AI and machine learning techniques. Many also include links to associated papers from Microsoft Research. For example, the 10 GB DESM Word Embeddings dataset provides the IN and OUT word2vec embeddings for 2.7M words trained on a Bing query corpus of 600M+ queries.

Other data sets of note include:

  • A collection of 38M tweets related to the 2012 US election
  • 3-D capture data from individuals performing a variety of hand gestures
  • Infer.NET, a framework for running Bayesian inference in graphical models
  • Images for 1 million celebrities, and associated tags
  • MS MARCO, a new large-scale dataset for reading comprehension and question answering

Most data sets are provided as plain text files, suitable for importing into Python, R, or other analysis tools. In addition to downloading the data, you can also deploy the datasets for analysis into Microsoft Azure: see the FAQ for details on this and other ways the datasets can be used.
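
As a hedged sketch of that import step, the snippet below parses a small tab-separated sample in the general shape a word-embedding file might take. The layout is an assumption for illustration, not the actual DESM file format:

```python
import csv
import io

# Illustrative only: assumes a "word<TAB>v1<TAB>v2..." layout per line;
# the real DESM download's format may differ.
sample = "apple\t0.12\t-0.34\t0.56\nbanana\t0.22\t0.18\t-0.09\n"

embeddings = {}
for row in csv.reader(io.StringIO(sample), delimiter="\t"):
    word, *values = row
    embeddings[word] = [float(v) for v in values]

print(len(embeddings["apple"]))  # dimensionality of the sample vectors
```

For a real download, you would pass the open file object to `csv.reader` instead of the in-memory sample.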

As of this writing, the archive includes 51 datasets. You can explore and download the Microsoft Research datasets at the link below.

Microsoft: Microsoft Research Open Data

Enabling partner success at Microsoft Inspire 2018


Next week in Las Vegas, we will be meeting with thousands of partners from all over the world at Microsoft Inspire. The event is a forum to reinforce our commitment to making partner business opportunities the centerpiece of our investments. We continue to extend the capabilities of our cloud marketplaces, Azure Marketplace and AppSource, to make them the best place for partners of any type to be discovered, evaluated, and ultimately have their offerings purchased and deployed by our customer base. The event also gives us the opportunity to showcase the investments we're making to enable these same capabilities within our channel ecosystem, both now and in the future.

Empowering partner-to-partner joint selling

Next year, for the first time in Microsoft's history, we will be putting partner solutions (applications as well as consulting and managed service offers) in our product catalog alongside our Microsoft products and services. This means our direct salesforce and partner ecosystem will be able to sell first- and third-party solutions individually or combined.

For partners with applications and solutions listed in one of our marketplaces, this will enable joint selling opportunities connecting them with thousands of Cloud Solution Providers around the world. In turn, Cloud Solution Providers will have a dramatically expanded portfolio of offerings, from both Microsoft and our partner ecosystem, that they can combine alongside their own managed services to provide even greater value for their customers.

To take advantage of the partner-to-partner opportunity ahead, we encourage partners to publish their solutions and offers in one of the storefronts, Azure Marketplace and AppSource, today. These applications and services will get priority placement in our combined product catalog when the new capabilities launch.

Integrated Solutions now available in Azure Marketplace

This month, we released a set of multi-partner Integrated Solutions across three vertical areas with more coming soon.  Available through Azure Marketplace, these solutions ease many of the challenges partners face coordinating and assembling multiple point solutions to address customer needs. For customers, this means a unified buying and faster implementation experience, facilitated by a systems integrator (SI). For partners, it means delivering more value to customers in a much faster and repeatable way. To support partners in scaling these integrated solutions to more quickly meet customers’ specific needs, we also provide Azure credits to partners for their customer proofs-of-concept. New Integrated Solutions available today include:

  • Customer 360 Powered by Zero2Hero by Bardess, Cloudera, Trifacta, and Qlik. For financial services and retail organizations looking to rapidly gain a complete view of their customers for actionable insights.
  • Rapid Commerce by Publicis.Sapient, DataStax, and Acquia. A microservices commerce platform for retailers to enable an improved customer experience and business results.
  • Healthcare Cloud Security Stack by XentIT, TrendMicro, and Qualys. A unified cloud threat management solution for healthcare entities looking to maintain cybersecurity and HIPAA compliance in the cloud. 
  • Credit Analytics by RCG Global Services, Cazena, Trifacta, and Cloudera. Helping financial services organizations to manage loan portfolio risk, maximize risk-adjusted rate of return, minimize capital reserve requirements, and strengthen compliance.

Cloud+AI Partner Community now live

Our goal is to continually grow the ways we serve our partners. Whether they have technical questions, business questions, or anything in between, our aim is to provide our partners with a single place to find what they need. To this end, we are launching the Cloud+AI Partner Community to act as the primary online location for partners to access the information they need.


This will connect to the latest technical documentation, give partners access to experts in the field, and help partners learn from one another. 

We look forward to sharing more detail and hearing feedback directly from partners in Las Vegas.

Dell debuts ‘world’s most powerful 1U rack workstation’


Dell delivers compact solutions that pack a punch, including a 1U rack workstation and towers starting at $649.

Companies of all sizes and budgets looking for powerful, affordable, compact industry-leading workstations now have new choices for their needs from Dell, including the world’s most powerful 1U rack workstation.¹

The Dell Precision 3930 Rack delivers powerful performance in a compact industrial footprint. Its 1U rack height delivers better rack density and extended operating temperatures, while features such as its short depth, dust filters, and legacy ports allow it to integrate seamlessly into complex medical imaging and industrial automation solutions.

Other features include:

  • The rack workstation provides up to 64 GB of 2666MHz DDR4 memory thanks to the introduction of Intel Xeon E processors and an 8th Generation Intel Core CPU.
  • The Intel Xeon E processor supports Error Correcting Code (ECC) for increased reliability.
  • The rack workstation offers best-in-class workstation performance and provides the flexibility of up to 250W of doublewide GPUs, and scalability with up to 24 TB of storage.
  • With 3 PCIe slots, including an optional PCI slot, this workstation can tackle complex tasks with ease.
  • A range of NVIDIA Quadro professional GPUs are available. With the Quadro P6000, users benefit from 24GB of GDDR5X and powerful ultra-high-end graphics performance.
  • In addition, customers have the option to choose AMD Radeon Pro graphics.

If your company is looking for versatile, secure, and fast remote 1:1 user access, you can add optional Teradici PCOIP technology. The rack workstation effortlessly integrates into the datacenter, which helps reduce clutter at your desk.

A smaller footprint that doesn’t skimp on performance

Going small can lead to big things, so Dell has built these new entry-level workstations to fuel the future of innovation across engineering design, science, mathematics, and other data- and graphics-intensive fields. Running a highly powerful machine no longer requires having a large work space or a large budget, making this level of performance available to many companies and workers for the first time.

“Customers across Engineering & Manufacturing, Media & Entertainment, and beyond have come to rely on Dell workstations to deliver the highest performing systems for their critical workload. But as we enter the next era of workstations, the conversation is accelerating to immersive workflows utilizing even smaller footprints. Dell is leading the way in this evolution with these new entry-level workstations designed to deliver the ultimate in performance with a substantially smaller footprint,” says Rahul Tikoo, vice president and general manager of Dell Precision. “When access to leading technology improves, innovation flourishes. Sometimes something as simple as a smaller form factor can unleash new ideas and capabilities that have the power to reshape an industry.”

The Dell Precision 3630 Tower is 23 percent smaller² than the previous generation with more expandability, so workers can get the precise solution they need regardless of workspace constraints. It features a range of easy-to-reach ports that make it possible to connect to external data sources, storage devices, and more. It offers scalable storage featuring SATA and PCIe NVMe SSDs, which can be configured for up to 14 TB with RAID support.

As workstation users often create intellectual property, Dell will also offer an optional Smart Card (CAC/PIV) reader to make secure data management easier.

If you’re interested in creating or enjoying VR experiences and other resource-intensive tasks, this workstation is a good choice, thanks to 8th Generation Intel Core and new professional-grade Xeon E processors, with up to 64 GB of memory at speeds up to 2666MHz. It also offers up to 225W of NVIDIA Quadro and AMD Radeon Pro graphics support.

The new Dell Precision 3430 Small Form Factor Tower is a great fit for many workstation users, offering many of the same benefits as the Precision 3630, but in an even smaller form factor and up to 55W of graphics support. It’s also expandable with up to 6TB of storage with RAID support.

Dell also introduced support for Intel Core X-series processors in addition to the Intel Xeon W processor options already available on the Dell Precision 5820 Tower. These new processor options bring the enhanced performance and reliability of a workstation to a more affordable price point for customers.

Adding Intel Optane memory keeps responsiveness high and high-capacity storage costs lower on all these new Dell Precision 3000 series workstations. Customers can expect the same build quality and reliability of the Dell Precision line.

Available now: The Dell Precision 3430 Small Form Factor Tower and the Dell Precision 3630 Tower (both starting at $649) and the Dell Precision 5820 Tower workstation.

Available worldwide July 26, 2018: The Dell Precision 3930 Rack, which starts at $899.

 

¹When equipped with Intel Xeon E-2186G processor, available 64GB of 2666MHz memory capacity, NVIDIA P6000 graphics and 2x M.2 PCIe storage. Based on Dell internal analysis of competitive workstation products as of July 2018.

² Gen-over-Gen claim based on internal testing, July 2018.

The post Dell debuts ‘world’s most powerful 1U rack workstation’ appeared first on Microsoft 365 Blog.

Top Stories from the Microsoft DevOps Community – 2018.07.13

It’s been another busy week for VSTS and DevOps, and we’re excited to see some interesting articles and podcasts about DevOps on Azure. Moving a Git Repo from Bitbucket to VSTS One of the great things about Git is that it’s easy to move your repository from one hosting provider to another – and maintain... Read More

Because it’s Friday: Language and Thought


Does the language we speak change the way we think? This TED talk by Lera Boroditsky looks at how language structures like gendered nouns, or the way directions are described, might shape the way speakers of those languages think about things:

This talk was cited by Bill Venables in his excellent keynote talk at the useR!2018 conference, where he also presented the results of an elegant designed experiment that looked at the influence of gendered nouns by comparing languages without gendered nouns (like Hungarian) with those that have multiple genders for nouns like Spanish (which has feminine and masculine nouns) and German (which adds neuter nouns, for three genders total).

That's all from us for this week. Have a great weekend, and we'll be back next week. Enjoy!

Lynx is dead – Long live Browsh for text-based internet browsing


The standard for browsing the web over a text-based terminal is Lynx, right? It's the legendary text web browser that you can read about at https://lynx.invisible-island.net/ or, even better, run right now with

docker run --rm -it nbrown/lynx lynx http://hanselman.com/

Awesome, right? But it's text. Lynx renders alt-text rather than images, and doesn't really take advantage of modern browser capabilities OR modern terminal capabilities.

Enter Browsh! https://www.brow.sh/

Browsh is a fully-modern text-based browser. It renders anything that a modern browser can; HTML5, CSS3, JS, video and even WebGL. Its main purpose is to be run on a remote server and accessed via SSH/Mosh

Imagine running your browser on a remote machine connected to full power while ssh'ing into your hosted browsh instance. I don't know about you, but my laptop is currently using 2 gigs of RAM for Chrome and it's basically just all fans. I might be able to get 12 hours of battery life if I hung out in tmux and used browsh! Not to mention the bandwidth savings. If I'm tethered or overseas on a 3G network, I can still get a great browsing experience and just barely sip data.

Browsing my blog with Browsh

You can even open new tabs! Check out the keybindings! You gotta try it. Works great on Windows 10 with the new console. Just run this one Docker command:

docker run -it --rm browsh/browsh

If you think this idea is silly, that's OK. I think it's brilliant and creative and exactly the kind of clever idea the internet needs. This solves an interesting problem in an interesting way...in fact it returns us back to the "dumb terminal" days, doesn't it?

There was a time when my low-power machine waited for text from a refrigerator-sized machine. The fridge did the work and my terminal did the least.

Today my high-powered machine waits for text from another high-powered machine and then struggles to composite it all as 7 megs of JavaScript downloads from TheVerge.com. But I'm not bitter. ;)

Check out my podcast site on Browsh. Love it.

Tiny pixelated heads made with ASCII

If you agree that Browsh is amazing and special, consider donating! It's currently maintained by just one person and they just want $1000 a month on their Patreon to work on Browsh all the time! Go tell Tom on Twitter that you think it's special, then give him some coins. What an exciting and artful project! I hope it continues!





© 2018 Scott Hanselman. All rights reserved.

Supercharging the Git Commit Graph IV: Bloom Filters

We’ve been discussing the commit-graph feature in Git 2.18 and how we can use generation numbers to accelerate commit walks. One area where we can get significant speedup is when presenting output in topological order. This allows us to walk a much smaller list of commits than before. One place where this breaks down is... Read More

Spring Data Gremlin for Azure Cosmos DB Graph API


We are pleased to announce that Spring Data Gremlin is now available on Maven Central and source code on GitHub. This project provides Spring Data support for Graph databases that use Gremlin as a traversal language. For Azure users, it provides a delightful experience to interact with Azure Cosmos DB using Graph API. Java developers can now use Spring annotations to map with database entities, such as Vertex and Edge. It also allows developers to create their own database queries based on Spring Data Repository.

Key features

Spring Data Gremlin supports an annotation-oriented programming model to simplify mapping to database entities. It also provides support for creating custom queries.

  • Mapping with database entities, including annotations.

    • @Id to map a field in the domain class with the id field of an entity.
    • @Vertex and @VertexSet to map with Vertex.
    • @Edge, @EdgeSet, @EdgeFrom and @EdgeTo to map with Edge.

  • Advanced operations with Vertex, Edge and Graph.

    • findVertexById and findEdgeById to find data by specific field.
    • vertexCount and edgeCount for counting Vertex and Edge numbers.

  • Spring Data CRUD functionality.

  • Custom database queries based on Spring Data Repository.

Getting started

You can follow this tutorial to create a Graph database instance in Azure Cosmos DB and add a new Graph in Data Explorer. Use the URI and Keys on the overview page to configure your project.


Configure your project

Include the Spring Data Gremlin library in your project, by opening the pom.xml file and adding the following settings in the <dependencies> section. Then set up the application.yml file to connect with the Graph database you just created.

<dependency>
     <groupId>com.microsoft.spring.data.gremlin</groupId>
     <artifactId>spring-data-gremlin</artifactId>
     <version>2.0.0</version>
</dependency> 

More detailed configurations are available on GitHub.

Run it

Get started with our sample! Once you’ve executed it, you can open the Azure Portal to check the updated Graph in your database.


When it's done, read about our project on GitHub and discover more advanced features.

Next steps

For more information about using Spring on Azure, visit the following pages:

Feedback

Feedback and questions help us to improve. You can share your thoughts by contacting us on GitHub.

Azure.Source – Volume 40


Highlights

New Azure innovation advances customer success for the cloud- and AI-powered future - In preparation for Microsoft Inspire in Las Vegas this week (where partners and Microsoft meet to connect, learn, and collaborate to accelerate the digital transformation and success of our customers), Jason Zander (Executive Vice President, Microsoft Azure) highlighted new offers and technologies that will help organizations navigate their own digital transformation and build the critical foundations for cloud and AI. He showcases momentum from businesses and partners who are investing in intelligent tech from the cloud to the edge. This roundup post provides an overview of news items included below, including Azure Data Box Disk, Azure Virtual WAN, Azure Firewall, Azure SQL Data Warehouse performance, an offer for customers of Windows Server and SQL Server 2008/2008 R2, Azure Stack, and more.

Announcing new offers and capabilities that make Azure the best place for all your apps, data, and infrastructure - Windows Server and SQL Server 2008/2008 R2 are nearing the end of support in January 2020 and July 2019, respectively. Julia White (Corporate Vice President, Microsoft Azure) explains how you can migrate to Azure with free Windows Server and SQL Server 2008 extended security updates for an additional three years by moving those servers to Azure. In addition, for customers running any version of Windows Server or SQL Server, Azure Hybrid Benefit helps you save money with Azure like no other cloud can by providing up to 55 percent savings of running Windows Server or SQL Server on Azure.

Announcing new options for SQL Server 2008 and Windows Server 2008 End of Support - Takeshi Numoto (Corporate Vice President, Cloud + Enterprise) provides further details on the end of support of these products, and the 2008 End of Support Resource Center. Customers with active Software Assurance or subscription licenses can purchase Extended Security Updates annually for 75 percent of the full license cost of the latest version of SQL Server or Windows Server, or they can receive free Windows Server and SQL Server 2008 extended security updates for an additional three years by moving those servers to Azure.

Azure sets new performance benchmarks with SQL Data Warehouse - Raghu Ramakrishnan (Chief Technology Officer, Azure Data) announces that Azure SQL Data Warehouse has set new performance benchmarks for cloud data warehousing by delivering at least 2x faster query performance compared to before. Gigaom Research recently ran benchmark tests between Azure SQL Data Warehouse and Amazon Redshift using a workload derived from the well-recognized industry standard TPC Benchmark™ H (TPC-H). According to the Gigaom research, Azure SQL Data Warehouse ran 30 TB workloads at least 67 percent faster than Amazon Redshift. Azure SQL Data Warehouse also ensures that data is accessible across your organizations. We enhanced the service to support 128 concurrent queries so that more users can query the same database and not get blocked by other requests. In comparison, Amazon Redshift limits maximum concurrent queries to 50, limiting data access within the organization.

Vertical bar chart comparing query run time for 30 TB data on Amazon Redshift and Azure SQL DW

Now in preview

Welcome our newest family member - Data Box Disk - Azure Data Box Disk provides a simple, secure, SSD disk-based offering for offline data transfer to Azure. Customers that have data transfer requirements but lack the onsite technical expertise required for more robust data transfer offerings can use the Data Box Disk service to transport as much as 40 TB of AES 128-bit encrypted data into Azure by simply connecting the disks to a computer via USB or SATA and using drag-and-drop or Robocopy commands for data transfer. The free Data Box Disk Preview is available in the EU and US. Separately, Azure Data Box, currently in preview, has expanded the preview to Europe and the UK. 

Image of two Azure Data Box Disks

Announcing public preview of Azure Virtual WAN and Azure Firewall - Learn how you can build a secure, high-performant, and reliable global network in the cloud for your mission-critical services. Azure Virtual WAN is a networking service providing optimized and automated branch to branch connectivity that makes it easy to connect to and through Azure by seamlessly connecting their branches to each other and Azure using last mile Internet. Azure Firewall is a cloud-native network security service that protects your Azure Virtual Network resources. It is a fully stateful firewall-as-a-service with built-in high availability and unrestricted cloud scalability. Customers can centrally create, enforce, and log application and network connectivity policies, spanning Fully Qualified Domain Names (FQDNs), IP Addresses, ports and protocols across subscriptions and virtual networks.

Introducing Dev Spaces for AKS - Gabe Monroy (PM Lead, Containers, Microsoft Azure) announces the public preview of Dev Spaces for Azure Kubernetes Services (AKS), an easy way to build and debug applications for Kubernetes. Though Dev Spaces is only available on Azure, the scaffolding builds upon the open source Draft project, which provides an inner-loop experience on any Kubernetes cluster. Learn how you can rapidly iterate on code changes, remotely debug containers, and easily bootstrap new applications.

Azure NetApp Files now in public preview - Andrew Chen (Principal PM Manager) announces the public preview of Azure NetApp Files, an Azure native service powered by NetApp's leading ONTAP technology, which has long been trusted by enterprises to meet their requirements for performance, scalability, data management, security, and hybrid capability. NetApp enables enterprises to overcome the challenge of deploying file-based workloads to the cloud. During preview, Azure NetApp Files is available in East US and will soon be offered in West US 2, at a reduced price for the duration of the preview.

Also in preview

Now generally available

General availability of user behavior analytics tools in Azure Application Insights - Introduced at Microsoft Build 2017, the user behavior analytics tools for Application Insights is now generally available. Users, Sessions, Events, and the other user behavior analytics tools are part of each Application Insights resource in the Azure portal. In this post, you can learn what's new since the release to preview last year.

Screenshot of a sample User Flows diagram in Azure Application Insights

Power BI Embedded dashboards with Azure Stream Analytics - Power BI Embedded simplifies how ISVs and developers can quickly add stunning visuals, reports, and dashboards to their apps. Using Power BI with Azure Stream Analytics enables users of Power BI Embedded dashboards to easily visualize insights from streaming data within the context of the apps they use every day. With Power BI Embedded, users can also embed real-time dashboards right in their organization's web apps.

Also generally available

The Azure Podcast

The Azure Podcast: Episode 237 - DevOps with AKS - A great in-person discussion with two of my colleagues at Microsoft, Daniel Selman, who is a Kubernetes Consultant, and Razi Rais, who focuses on DevOps and Blockchain. We talk about how a developer can quickly go through the motions to package and deploy applications to AKS and monitor them as they run.

News and updates

Route Matrix, isochrones, IP lookup, and more added to Azure Maps - Chris Pendleton (Principal PM Manager, Azure Maps) announces several new capabilities in Azure Maps for building mission-critical, large-scale apps for your mobility, IoT, and enterprise solutions, including: Matrix Routing for determining travel times and distances from a set of origins to a set of destinations; isochrones (or Route Range) for determining how far a user can travel from a single point in a given amount of time; batch geocoding, which enables customers to post bulk addresses for geocoding or business-location lookups in a single request; a batch routing service, which enables customers to post a list of origins and destinations and receive the estimated time and distance between each respective pair; new map canvases, including satellite imagery; new map control modules; polygons for area features, such as administrative districts (e.g., counties); and the ability to look up the location of IP addresses.
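Conceptually, Matrix Routing returns an estimate for every origin-destination pair. A minimal pure-Python sketch of that shape (straight-line haversine distance with an assumed average speed, purely illustrative and not the Azure Maps API itself):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def route_matrix(origins, destinations, avg_kmh=50):
    """Estimate distance and travel time for every origin/destination pair."""
    matrix = []
    for origin in origins:
        row = []
        for dest in destinations:
            km = haversine_km(origin, dest)
            row.append({"km": km, "hours": km / avg_kmh})
        matrix.append(row)
    return matrix

# Two origins x one destination -> a 2x1 matrix of estimates
matrix = route_matrix([(47.6, -122.3), (45.5, -122.7)], [(37.8, -122.4)])
```

The real service solves the same pairwise problem against the road network, returning actual drive times rather than straight-line estimates.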

Azure HDInsight now supports Apache Spark 2.3 - Apache Spark 2.3.0 is now available for production use on the managed big data service Azure HDInsight. Ranging from bug fixes (more than 1400 tickets were fixed in this release) to new experimental features, Apache Spark 2.3.0 brings advancements and polish to all areas of its unified data platform. Maxim Lukiyanov (Sr. PM Manager, Azure HDInsight) provides details on this release, which has something for everyone, including data engineers, data scientists, business analysts, and developers.

Kafka 1.0 on HDInsight lights up real-time analytics scenarios - Dhruv Goel (PM, Azure Big Data) covers the release of Apache Kafka 1.0 on HDInsight, including idempotent producers so that you don’t have to deduplicate; transactions for multi-topic and multi-partition atomic writes; exactly-once semantics; and support for record headers.
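The idempotent-producer guarantee works by having the broker track a per-producer sequence number and drop retried duplicates. A toy sketch of that dedup logic (a conceptual illustration, not Kafka's implementation):

```python
class Broker:
    """Accepts at most one write per (producer_id, sequence) pair."""

    def __init__(self):
        self.log = []
        self.last_seq = {}  # producer_id -> highest sequence accepted

    def append(self, producer_id, seq, record):
        # A retried send reuses the same sequence number, so it is dropped.
        if seq <= self.last_seq.get(producer_id, -1):
            return False  # duplicate: already in the log
        self.last_seq[producer_id] = seq
        self.log.append(record)
        return True

broker = Broker()
broker.append("p1", 0, "order-created")
broker.append("p1", 0, "order-created")  # network retry: deduplicated
broker.append("p1", 1, "order-shipped")
```

This is why producers no longer need client-side dedup: the retry lands with the same sequence number and the broker discards it.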

Announcing Azure Expert MSPs - Corey Sanders (Corporate Vice President, Azure) expands on the recent announcement of the Azure Expert Managed Services Provider (MSP) program, which gives our most capable Azure MSPs full support to help drive revenue for themselves—and their customers. These expert partners have proven real-world proficiency and skills for datacenter lift-and-shift, born-in-cloud new applications, and everything in between. Annually, Azure Expert MSPs complete a rigorous audit by an independent, third-party auditor, in which they must provide multiple customer references for Azure managed services projects delivered over the last 12 months.

Latest updates to Azure Database for PostgreSQL - Sudhakar Sannakkayala (General Manager, Azure Data) provides an update on Azure Database for PostgreSQL, which released to general availability in March of this year, including: virtual network service endpoints in public preview; support for large databases (up to 4 TB) and increased IOPS (up to 6000 IOPS); Geo Restore for geo-redundant backups and recovery is now GA; support for PostgreSQL 10.3 is now GA; migration support with Azure Database Migration Service (DMS) preview; memory-optimized pricing tier is now GA; and regional availability increased to 28 countries/regions worldwide.

Latest updates to Azure Database for MySQL - Sudhakar Sannakkayala (General Manager, Azure Data) provides a similar update on Azure Database for MySQL, with most of the capabilities outlined above for Azure Database for PostgreSQL, but adds Data-in Replication to synchronize data from a MySQL server running on-premises, in virtual machines, or database services outside Azure into Azure Database for MySQL. As you would expect, the updates for Azure Database for MySQL do not include support for PostgreSQL 10.3.

PHP Minor Version + Xdebug Update for August 2018 - In August 2018, we will update the Azure App Service PHP stacks to the latest available versions (7.1.19 and 7.2.7). In addition, we will also update the Xdebug binaries, including support for PHP 7.1 and 7.2. Xdebug paths are changing, too. See this post from the App Service Team to get the full details.

Additional news and updates

Azure Friday

Azure Friday | Episode 450 - Azure SignalR Service - Xiaokai He joins Scott Hanselman to discuss Azure SignalR Service. See how easy it is to start building your real-time web application without setting up your own SignalR server.

Azure Friday | Episode 451 - Azure Reserved VM Instances (RIs) - Yashesvi Sharma joins Scott Hanselman to discuss how to significantly reduce costs up to 72 percent compared to pay-as-you-go prices with one-year or three-year terms on Windows and Linux virtual machines (VMs). When you combine the cost savings gained from Azure RIs with the added value of the Azure Hybrid Benefit, you can save even more.
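The savings arithmetic behind those headline numbers is simple: compare the effective hourly rate of the reservation against the pay-as-you-go rate over the term. A quick sketch with hypothetical prices (the 72 percent figure is Microsoft's stated maximum, not something these made-up rates produce):

```python
def ri_savings_pct(payg_hourly, ri_hourly):
    """Percent saved vs. pay-as-you-go for a given effective reserved rate."""
    return round((1 - ri_hourly / payg_hourly) * 100, 1)

# Hypothetical rates for a small Linux VM (not real Azure prices)
pay_as_you_go = 0.10   # $/hour, pay-as-you-go
three_year_ri = 0.038  # $/hour, effective rate over a three-year term

print(ri_savings_pct(pay_as_you_go, three_year_ri))  # 62.0
```

Stacking the Azure Hybrid Benefit on top works the same way: it lowers the effective hourly rate further, so the same formula reports a larger savings percentage.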

Technical content

Lightning fast query performance with Azure SQL Data Warehouse - At the heart of every distributed database system is the need to align two or more tables that are partitioned on a different key to produce a final or intermediate result set. Data movement is an operation where parts of the distributed tables are moved to different nodes during query execution. This operation is required where the data is not available on the target node, most commonly when the tables do not share the distribution key. The most common data movement operation is a shuffle. Tomas Talius (Partner Architect, Azure Data) goes under the hood of Azure SQL Data Warehouse (SQL DW) to see how the shuffling speed has improved to provide significant query performance improvements.
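The shuffle described above can be sketched in a few lines: hash-partition each table on the join key so that matching keys land on the same node, then join locally per node. A toy illustration in plain Python (conceptual only, not SQL DW internals):

```python
from collections import defaultdict

def shuffle(rows, key_index, n_nodes):
    """Hash-partition rows across nodes by the join key (data movement)."""
    partitions = defaultdict(list)
    for row in rows:
        partitions[hash(row[key_index]) % n_nodes].append(row)
    return partitions

def distributed_join(left, right, n_nodes=4):
    """Shuffle both sides on column 0, then join locally on each node."""
    l, r = shuffle(left, 0, n_nodes), shuffle(right, 0, n_nodes)
    return [
        lrow + rrow[1:]
        for node in range(n_nodes)
        for lrow in l.get(node, [])
        for rrow in r.get(node, [])
        if lrow[0] == rrow[0]
    ]

orders = [("c1", "order-1"), ("c2", "order-2")]
customers = [("c1", "Contoso"), ("c2", "Fabrikam")]
joined = distributed_join(orders, customers)
```

Because both sides are partitioned on the same key, each node can join its slice independently; the cost of a query is dominated by how fast the shuffle moves rows between nodes, which is exactly what the post says has been sped up.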

Events

Enabling partner success at Microsoft Inspire 2018 - Microsoft Inspire 2018 kicks off today, July 16th, in Las Vegas. Charlotte Yarkoni (Corporate Vice President, Growth + Ecosystems) outlines how we continue to extend the capabilities of our cloud marketplaces, Azure Marketplace and AppSource, to make them the best place for partners of any type to be discovered, evaluated, and ultimately have their offerings purchased and deployed by our customer base. Next year, for the first time in Microsoft’s history, we will put partner solutions (applications as well as consulting and managed service offers) in our product catalog alongside our Microsoft products and services. This month, we released a set of multi-partner Integrated Solutions across three vertical areas through Azure Marketplace, such as Customer 360 Powered by Zero2Hero (by Bardess, Cloudera, Trifacta, and Qlik) and Rapid Commerce (by Publicis.Sapient, DataStax, and Acquia). She also announced the launch of the Cloud+AI Partner Community, the primary online location for partners to access the information they need.

Screenshot of Cloud+AI Partner Community website

Azure webinar series: Connecting, Building, and Protecting Your Network - Join us for this webinar on Wednesday, July 18, 2018 (10:00 AM - 11:00 AM Pacific Time) to learn: The Microsoft vision for the future of the cloud-centric network, about two new services that target the changing boundaries of the application landscape, and how our partners and customers are adapting their networks and security to a cloud-based model.

Streaming analytics use cases with Spark on Azure - Join the Streaming Analytics Use Cases on Apache Spark on-demand webinar to learn how to get insights from your data in real time, and watch a walkthrough of two Spark Streaming use case scenarios: IoT analytics and clickstream analytics.

Service Fabric Community Q&A 26th Edition - The Service Fabric Team will be holding their 26th monthly community Q&A call this Thursday, July 19th at 10:00 AM Pacific Time. As always, there is no need to RSVP. Just go to http://aka.ms/sfcommunityqa at 10:00 AM Thursday and you’re in.

Customers and partners

Azure IoT Central available for partners to manage and extend customer solutions - Sunil Tahilramani (Senior Product Manager, Azure IoT) covers a number of features that make it easy for partners to resell, manage, and extend customer solutions, including: availability of Azure IoT Central in Azure CSP subscriptions; powerful workflows and new scenarios, including Connected Field Service; unlocking data from IoT Central for deeper insights; and a Power BI solution template.

Consulting offers in Azure Marketplace arrive in Canada - Whether you are seeking someone to help learn new skills in Azure, handle the unique parameters of migrating a sensitive workload, or create high-impact visual storytelling to drive business decision making, Azure Marketplace makes accessing this expertise easier than ever. The partners in Azure Marketplace can help you get started with confidence to right-size and optimize your cloud. We added Consulting Services offers to Azure Marketplace for US customers in May. Consulting Services offers are now available in Canada, too.

How Microsoft Azure Stack means transformation and opportunities for partners - Talal Alqinawi (Senior Director, Azure Marketing) outlines how the partner ecosystem has been instrumental in unlocking some amazing scenarios where customers are using Azure Stack in remote, often disconnected, environments to deliver value to businesses and communities. Learn how your role as a partner can cover one or all of the phases in the journey of an Azure Stack Customer.

Azure Marketplace new offers June 1–15 - The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. In the first half of June, we published 21 new offers, including OutSystems on Microsoft Azure, SoftNAS Cloud Enterprise, and Windows Server 2019 Datacenter Preview.

Additional customers and partners

The IoT Show

The IoT Show | Dynamics Connected Field Service for IoT - Kyle Young, Principal PM Manager in the Dynamics team, shows us how the integration between Azure IoT Hub and Dynamics Connected Field Service allows automating workflows and bridging the world of IoT with the enterprise.

The IoT Show | Connecting Legacy Systems to Modern IoT Apps - Building IoT applications often requires connecting existing systems and infrastructures to the Cloud. Chris Segura, PM in the Azure IoT team, tells us what that entails and how Azure IoT Edge can be used to bridge legacy systems to modern IoT applications powered by the Cloud.

Developer spotlight on Dev Spaces

Quickstart: Create a Kubernetes dev space with Azure Dev Spaces (.NET Core and Visual Studio) - Follow this quickstart to learn how to set up Azure Dev Spaces with a managed Kubernetes cluster in Azure, iteratively develop code in containers using Visual Studio, and debug code running in your cluster.

Quickstart: Create a Kubernetes dev space with Azure Dev Spaces (Node.js and VS Code) - Follow this quickstart to learn how to set up Azure Dev Spaces with a managed Kubernetes cluster in Azure, iteratively develop code in containers using VS Code, and debug code running in your cluster.

SmartHotel360 Microservices on Azure Kubernetes Service - To help you learn how to deploy microservices written in any framework to AKS, we updated the SmartHotel360 back-end microservices source code and deployment process to optimize it for AKS. Clone, fork, or download the AKS and Azure Dev Spaces demo on GitHub to learn more.

A Cloud Guru - Azure This Week

A Cloud Guru | Azure This Week - 13 July 2018 - In this episode of Azure This Week, Dean takes a look at the Public Preview of Immutable Blob Storage, General Availability of user behaviors in Application Insights, and there’s a new free e-book on serverless architectures.


