Time synchronization for financial services in Azure

You know the expression: “Time is money.” For many workloads in the capital markets space, time accuracy is money. Depending on the application and applicable regulation, financial transactions need to be traceable down to the second, millisecond, or even microsecond. Financial institutions themselves are under scrutiny to prove the validity and traceability of these transactions. At Microsoft, we want to ensure our customers are aware of time accuracy and synchronization best practices, and of how to mitigate the risk of negative impact from time synchronization issues on Azure.

Time accuracy for a computer clock generally refers to how close the computer clock is to Coordinated Universal Time (UTC), the current time standard. In turn, UTC is based on International Atomic Time (TAI), which combines the output of some 400 atomic clocks worldwide; leap seconds keep UTC within approximately one second of mean solar time at 0° longitude. While the most precise time can be achieved by reading these reference clocks directly, it is impractical to attach a GPS receiver to every machine in a datacenter. Instead, a network of time servers, downstream from these systems of record, is used to achieve scalability. Each computer checks its own time against these time servers and adjusts its clock if needed. This is time synchronization in a nutshell.
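
As a minimal sketch of that check-and-adjust step, the following uses the third-party ntplib package (an illustrative choice; production systems rely on a timekeeping daemon such as the Windows Time service or chrony) to measure how far the local clock has drifted from an NTP server:

# Minimal sketch: measure local clock offset against an NTP time server.
# Assumes the third-party 'ntplib' package (pip install ntplib).
import ntplib

client = ntplib.NTPClient()
response = client.request('time.windows.com', version=3)

# offset: estimated difference between the local clock and the server, in seconds.
# A timekeeping daemon repeats this comparison periodically and gently disciplines
# the clock, rather than stepping it from a one-off query like this one.
print(f"clock offset: {response.offset:+.6f} s (round-trip delay: {response.delay:.6f} s)")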

Computer clocks are susceptible to time drift that is imperceptible to most users but can have costly consequences. The technology computer clocks rely on is typically an inexpensive oscillator circuit or a battery-backed quartz crystal, which can drift by seconds per day; over time, that drift accumulates into significant discrepancies. This can be mitigated by a timekeeping program that periodically checks with external sources, corrects any drift in the computer clock, and synchronizes time globally. Workloads that need sub-second accuracy must be built on systems optimized for it. Certain financial services industry (FSI) applications, like high-frequency trading, have a very low tolerance for time drift. Regulations like MiFID II in the E.U., the U.S. Securities and Exchange Commission’s requirements, and FINRA Regulatory Notice 16-23 ensure oversight of this potential impact on their respective markets.

Time sync is just one of the ways that Microsoft is working with customers to create more secure and compliant applications. We’ll continue to update our security and compliance offerings, so stay tuned for more information. In the meantime, check out our Azure Trust Center to learn more about our security offerings. For detailed instructions on how Microsoft supports time synchronization on Azure, check out the How-To tutorials for Windows and Linux.


Ansible 2.7: What’s new for Azure

Ansible 2.7 will be released on 4 October 2018. Here I’d like to share what’s coming for Azure in Ansible 2.7. In total, 21 new Azure modules were added, giving you the ability to natively automate the deployment and configuration of the Azure resources below.

Azure Web Apps: Create and configure your Azure Web Apps hosting web applications, REST APIs, and mobile backends using Ansible.

Azure Traffic Manager: Create and configure Azure Traffic Manager to distribute traffic optimally to services across global Azure regions using Ansible.

Azure Database: Create and configure an Azure Database for SQL/MySQL/PostgreSQL server using Ansible.

Azure Route: Create and configure your own routes to override Azure's default routing using Ansible.

Azure Application Gateway: Create and configure an Azure Application Gateway to manage web traffic using Ansible.

Azure Autoscale: Create and configure Azure autoscale to help applications perform their best when demand changes using Ansible.

Additional facts module for VM and ACR: Get information about virtual machines or Azure container registry for further configuration using Ansible.

Automate Azure Web Apps using Ansible and Jenkins

Azure App Service Web Apps (or just Web Apps) is a service for hosting web applications, REST APIs, and mobile backends. With Ansible 2.7 and Jenkins, you can enjoy a seamless CI/CD experience for Azure Web Apps. In the workflow below, Jenkins is used to build and deploy, and Ansible is called to handle provisioning. For more details and a tutorial, read Automate Infrastructure and Deployment on Azure using Jenkins and Ansible.

Manage web traffic with an Azure application gateway using Ansible

With Ansible 2.7, you can create an Azure Application Gateway, a web traffic load balancer that enables you to manage traffic to your web applications. In the workflow below, an Azure application gateway is set up to manage web traffic for a backend pool that uses two Azure container instances. You can find further details and a tutorial at Manage web traffic with an Azure application gateway using Ansible.

I’m excited about the progress we’re making for Ansible on Azure. Please visit the Ansible on Azure developer hub for more information.

At the upcoming AnsibleFest 2018, we will deliver two sessions and demonstrate Ansible integration with Azure. Join us and drop by the Azure Ansible booth to see a demo or chat with us about your cloud automation requirements.

Analytics will be available on TFS 2019 RC1 – Want to help us test it?

Analytics is the new reporting platform for both Team Foundation Server (TFS) and Azure DevOps. Analytics will be available with TFS 2019 RC1, which should be available later this year. We are looking for TFS customers who are planning to install RC1 when it’s released and would be willing to participate in a test program...

CUDA 10 is now available, with support for the latest Visual Studio 2017 versions

We are pleased to echo NVIDIA’s announcement of CUDA 10 today, and we are particularly excited about CUDA 10.0’s Visual Studio compatibility. CUDA 10.0 will work with all past and future updates of Visual Studio 2017. To stay committed to our promise of a pain-free upgrade to any version of Visual Studio 2017, we partnered closely with NVIDIA over the past few months to make sure CUDA users can easily migrate between Visual Studio versions. Congratulations to NVIDIA on this milestone, and thank you for a great collaboration!

A Bit of Background

In various updates of Visual Studio 2017 (e.g., 15.5), and even in earlier major Visual Studio versions, we discovered that some of the library headers became incompatible with CUDA’s NVCC compiler in the 9.x versions. The crux of the problem is that two C++ compilers add modern C++ standard features at different paces while having to work with a common set of C++ headers (e.g., STL headers). We heard from many of you that this issue was forcing you to stay behind on older versions of Visual Studio. Thank you for that feedback. Together with NVIDIA, we now have a solution that enables all Visual Studio 2017 update versions to work with the CUDA 10.0 tools. This is reinforced by both sides adding tests and validation processes as part of release quality gates. For example, we have added NVIDIA/Cutlass, a CUDA C++ project, to the MSVC Real-World Testing repository as a requirement for shipping. We also have targeted unit tests for PR build validations to guard against potential incompatibility issues.

In closing

We’d love for you to download Visual Studio 2017 version 15.8 and try out all the new C++ features and improvements. As always, we welcome your feedback. We can be reached via the comments below or via email (visualcpp@microsoft.com). If you encounter other problems with MSVC in Visual Studio 2017 please let us know through Help > Report A Problem in the product, or via Developer Community. Let us know your suggestions through UserVoice. You can also find us on Twitter (@VisualC) and Facebook (msftvisualcpp).

Azure Marketplace new offers – Volume 20

We continue to expand the Azure Marketplace ecosystem. From August 16 to 31, 87 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual machines

128T Networking Platform

128T Networking Platform: The 128T Networking Platform enables organizations to rapidly connect to cloud infrastructure, bringing centralized visibility and management of policies across public and private cloud environments.

Ant Media Server Community Edition 1.4.1

Ant Media Server Community Edition 1.4.1: Publish live streams with WebRTC or Real-Time Messaging Protocol (RTMP). Your live or video-on-demand streams can play anywhere, including on mobile browsers.

Ant Media Server Enterprise Edition

Ant Media Server Enterprise Edition: Ant Media Server Enterprise Edition supports low-latency WebRTC, Real-Time Messaging Protocol (RTMP), MP4, HLS (HTTP Live Streaming), adaptive bitrate, thumbnail generation, and more.

App360 Cloud Management Platform

App360 Cloud Management Platform: After enterprises have virtualized their IT infrastructure environments, the next step is to enable centralized management and interoperability between public clouds and on-premises resources. App360 offers hybrid and multi-cloud management.

Application Load Balancer - ADC

Application Load Balancer / ADC: Our application delivery controller (ADC) load balancer allows you to easily achieve security, traffic management, SSO/pre-authentication, and load balancing.

Azure CycleCloud

Azure CycleCloud: Azure CycleCloud is a free tool for creating, managing, operating, and optimizing HPC clusters in Azure. It is designed to enable enterprise IT organizations to provide secure and flexible cloud HPC and Big Compute environments to their end users.

Baffle Application Data Protection

Baffle Application Data Protection: This solution allows your apps to encrypt data stored on-premises or in cloud databases without code changes. Address data security and privacy while ensuring that third parties never have access to your data without your knowledge.

BigDL - Distributed Deep Learning for Apache Spark

BigDL: Distributed Deep Learning for Apache Spark: BigDL is a deep learning framework (similar to TensorFlow or Caffe) for Apache Spark. To make it easy to build Spark and BigDL applications, a high-level Analytics Zoo is provided for end-to-end analytics plus AI pipelines.

CIS Red Hat Enterprise Linux 6 Benchmark v2.0.2.0

CIS Red Hat Enterprise Linux 6 Benchmark v2.0.2.0: This instance of RHEL 6 is hardened according to a CIS Benchmark. Launching an image hardened according to the trusted security configuration baselines prescribed by CIS will reduce cost, time, and risk to your organization.

CIS Red Hat Enterprise Linux 7 Benchmark v2.2.0.0

CIS Red Hat Enterprise Linux 7 Benchmark v2.2.0.0: This instance of RHEL 7 is hardened according to a CIS Benchmark. Launching an image hardened according to the trusted security configuration baselines prescribed by CIS will reduce cost, time, and risk to your organization.

Equalum Real-Time Streaming

Equalum Real-Time Streaming: Equalum is an end-to-end stream processing platform that enables companies to perform real-time analytics on data streams and data sets to gain actionable insights.

Full Disk Encryption

Full Disk Encryption: Full Disk Encryption uses software-based encryption and pre-boot authentication to protect data. More than one disk may be encrypted on the same device. All files on a volume are encrypted, including temporary files, swap files, and OS files.

GIMPv26

GIMPv26: GIMP (GNU Image Manipulation Program) is freely distributed and can be used as a simple paint program, an expert-quality photo retouching program, an online batch processing system, a mass-production image renderer, an image format converter, and more.

Global Server Load Balancer (GSLB)

Global Server Load Balancer (GSLB): Global server load balancing provides datacenter failover or enhances performance by directing users to their closest datacenters. GSLB is a DNS-based system that manipulates the DNS response based on datacenter availability and performance.

Kelverion Runbook Studio for Azure Automation v2.1

Kelverion Runbook Studio for Azure Automation v2.1: The Kelverion Runbook Studio provides rich graphical authoring of Azure Automation runbooks without the need for a permanent internet connection or a deep knowledge of PowerShell.

NCache Enterprise 4.9 SP1 Server

NCache Enterprise 4.9 SP1 Server: NCache is a fast and linearly scalable in-memory distributed cache for .NET and Java applications. NCache handles object caching for mission-critical .NET applications with real-time data access needs.

NetFoundry Cloud Gateway

NetFoundry Cloud Gateway: Improve the agility, security, and performance of your cloud applications. Like Azure, we provide the core infrastructure as a fully managed, on-demand service so that you control the networks without having to build or manage them.

NetkaQuartz Service Desk

NetkaQuartz Service Desk: NetkaQuartz Service Desk functionality and features are based on ITIL standards, namely the processes of incident management, service level management, problem management, knowledge management, and configuration management.

NetScaler MA Service Agent 12.1

NetScaler MA Service Agent 12.1: The NetScaler MA Service agent works as an intermediary between the NetScaler Management and Analytics Service and the NetScaler instances within Azure. The agent provides a secure channel for configuration, logs, and telemetry data.

Netsweeper 6.0.6

Netsweeper 6.0.6: Netsweeper 6.0.6 – Get it now.

NVIDIA GPU Cloud Image for Deep Learning and HPC

NVIDIA GPU Cloud Image for Deep Learning and HPC: This is an optimized environment for running the GPU-accelerated containers from the NVIDIA GPU Cloud (NGC) container registry. The deep learning containers from the NGC container registry are tuned, tested, and certified.

Pulse Virtual Web Application Firewall

Pulse Virtual Web Application Firewall: Pulse (previously Brocade) Virtual Web Application Firewall (Pulse vWAF) is a scalable security platform for off-the-shelf solutions and custom applications that envelops applications in their own security perimeter.

SimpleHelp Suite Business Edition

SimpleHelp Suite Business Edition: SimpleHelp allows you to remotely control users, monitor remote systems, and receive alerts when they need maintenance. SimpleHelp is a stand-alone product and integrates with third-party solutions such as Symantec Altiris and Dell KACE.

StreamSets Data Collector for HDInsight

StreamSets Data Collector for HDInsight: StreamSets Data Collector deploys on top of Azure HDInsight and provides a full-featured integrated development environment that lets you design, test, deploy, and more – all without having to write custom code.

tune-one Network Monitoring System

tune-one Network Monitoring System: The tune-one Network Monitoring System is powered by the open-source network monitoring software Nagios Core, and it offers an admin interface. The solution monitors your systems online using Azure.

Web applications

AvePoint Cloud Records for Office 365

AvePoint Cloud Records for Office 365: Apply automated business rules to manage your Office 365 content and physical records — from creation to classification to retention — to achieve compliance without user intervention.

BreakingPoint Cloud Azure DDoS Protection Testing

BreakingPoint Cloud Azure DDoS Protection Testing: BreakingPoint Cloud helps organizations migrating to Azure take a proactive approach to cloud security. With this solution, you can model distributed denial of service attacks in a safe environment.

ClickCompliance Azure

ClickCompliance Azure: This is an integrated system designed to defend corporate integrity when responding to criminal charges. It captures the acknowledgment and acceptance of a code of ethics and other corporate policies.

Dynamic Signal

Dynamic Signal: Use Azure Active Directory to manage user access, provision user accounts, and enable single sign-on with Dynamic Signal. Requires an existing Dynamic Signal subscription.

GoChain Single Node Blockchain - Developer Edition

GoChain Single Node Blockchain - Developer Edition: This is a fork of the Ethereum blockchain with enhancements to increase transaction throughput. It encompasses a single-node development environment to allow low-cost access to build on GoChain.

HPCBOX Cluster for SU2

HPCBOX Cluster for SU2: HPCBOX provides intelligent workflow capability that lets you plug cloud infrastructure into your application pipeline, giving you fine-grained control of your HPC cloud resources and applications.

Informatica Big Data Management BYOL

Informatica Big Data Management BYOL: Informatica Big Data Management provides data management solutions to quickly and holistically integrate, govern, and secure big data for your business.

iProov Platform

iProov Platform: iProov authenticates users by combining face biometrics and anti-spoofing. The ID Matcher solution is already in production with ING and Rabobank in the Netherlands across millions of devices.

Stratis ICO Platform

Stratis ICO Platform: The Stratis ICO (Initial Coin Offering) Platform allows you to run a secure and flexible web-based application that enables buyers to purchase your tokens before the point of initial allocation.

Striim for Data Integration to SQL Data Warehouse

Striim for Data Integration to SQL Data Warehouse: The Striim platform is an enterprise-grade streaming data integration solution. Striim makes it easy to ingest and process high volumes of streaming data from diverse sources on-premises or in the cloud.

Veeam Backup & Replication 9.5

Veeam Backup & Replication 9.5: With Veeam Backup & Replication 9.5, you can deploy and manage Veeam Agent for Windows and Veeam Agent for Linux on Azure VM instances. Manage and protect your apps in the cloud from one centrally managed console.

Container solutions

Pachyderm

Pachyderm: Teams that find themselves struggling to maintain a growing mess of advanced data science tasks, such as machine learning or bioinformatics/genomics research, use Pachyderm to greatly simplify their system and reduce development time.

Consulting services

1-Day Digital Sales Workshop - Sitecore & Azure

1-Day Digital Sales Workshop | Sitecore & Azure: This workshop will help optimize processes and unlock valuable data to develop new channels of revenue. The data that RDA captures reveals your digital capability, infrastructure, and key opportunities.

App Migration to Azure 10-Day Workshop

App Migration to Azure 10-Day Workshop: With Imaginet’s 10-day app modernization and migration workshop, our Azure professionals will accelerate your modernization and migration efforts while adding in the robustness and scalability of cloud technologies.

Application Migration & Sizing - 1-Week Assessment

Application Migration & Sizing: 1-Week Assessment: We use CloudPilot’s static code analysis to assess your apps and data for migration to containers, VMs, and Platform-as-a-Service in a fraction of the time a manual assessment would take.

AzStudio PaaS Platform - 2-Day Proof of Concept

AzStudio PaaS Platform: 2-Day Proof of Concept: This two-day proof of concept on our AzStudio platform enables a holistic view of cloud-centric design and delivering controls to meet your development and DevOps needs.

Azure Architecture - 2 week Assessment

Azure Architecture: 2 week Assessment: Sign up for this two-week assessment to review your business processes and needs for a new cloud environment designed around your budget, staff, and requirements.

Azure Architecture Design 2-Week Assessment

Azure Architecture Design 2-Week Assessment: In this assessment, Sikich will identify the client’s business requirements and create an Azure architecture design. The design will include a structural design, monthly estimated cost, and estimated installation services.

Azure Cloud Security - 3-Day POC

Azure Cloud Security: 3-Day POC: This proof of concept is for technical leaders and decision-makers. You will work with our experienced engineers to demonstrate the functionality and advantages of Azure Security Center.

Azure Datacenter Migration 2-Week Assessment

Azure Datacenter Migration 2-Week Assessment: After listening to your business requirements and future business goals, Sikich will develop an Azure datacenter migration plan to expand your IT capabilities with hosting on Azure.

Azure Infrastructure as a Service - 3-Day POC

Azure Infrastructure as a Service: 3-Day POC: Technical leaders and decision-makers will work with our experienced Azure architects to run proprietary, non-impactful assessment tools against your infrastructure to realize actual compute, network, and I/O utilization.

Azure Jumpstart - 4-Day Workshop

Azure Jumpstart: 4-Day Workshop: This on-site workshop will involve looking at the client's existing framework; delivering an overview of Microsoft Azure, networking, and virtual machines; setting up the Azure site-to-site VPN or point-to-site VPN with the client; and more.

Azure Migration - 1-Hr Briefing

Azure Migration: 1-Hr Briefing: Considering a migration to Microsoft Azure? Sign up for this one-hour briefing for a high-level overview of the benefits of moving to the cloud and the options you have available on Azure.

Azure Migration Assessment - 1-Day Assessment

Azure Migration Assessment: 1 Day Assessment: This virtual assessment of a server environment will consider the financial impact and technical implications of migrating to Azure.

Azure Modernization - 1-Week (On Prem) Assessment

Azure Modernization: 1-Week (On Prem) Assessment: SphereGen will provide a custom assessment for your organization, outlining an Azure migration plan to access the power of the cloud.

Azure Modernization - 1-Week (Remote) Assessment

Azure Modernization: 1-Week (Remote) Assessment: SphereGen will provide a custom assessment for your organization, outlining an Azure migration plan to access the cloud. Our thorough process analysis will help you make the right technology choices.

Azure Subscription - 2-Wk Assessment

Azure Subscription: 2-Wk Assessment: We assess your Azure subscription and provide recommendations on how to save money. In two weeks, optimize and control Azure resources for budgets, security standards, and regulatory baselines.

Blockchain Executive Workshop - 1-Day

Blockchain Executive Workshop: 1-Day: This workshop, targeting senior executives, covers blockchain basics and blockchain's applicability to executives’ respective industries.

Cloud Assure Azure Health Check - 3-Day Assessment

Cloud Assure Azure Health Check: 3-Day Assessment: We'll work with you to understand your Azure resource utilization, and we'll make recommendations based on best practices, identifying opportunities for efficiency, scaling, and resiliency.

Cloud Discovery Workshops - 1-2 weeks

Cloud Discovery Workshops: 1-2 weeks: These workshops are designed for companies at the start of their public cloud adoption. The workshops help a customer identify strategy, objectives, technology, target operating model, and migration approach.

Cloud Governance - 3-Wk Assessment

Cloud Governance: 3-Wk Assessment: Within a fixed time and fixed scope engagement, our team of Azure architects will deliver an assessment spanning five key areas of Azure governance.

Cloud HPC Consultation 1-Hr Briefing

Cloud HPC Consultation 1-Hr Briefing: This one-hour custom online briefing is for technical and business leaders to learn how Cloud HPC can benefit their engineering simulations.

Cloud Organisational Readiness - 5-Day Assessment

Cloud Organisational Readiness: 5-Day Assessment: This assessment unites stakeholders to review the platforms, processes, and resources required for a successful transition to the cloud, and it considers the technical and the cultural impact on your organization.

Cloud Standalone - 4-Wk PoC

Cloud Standalone: 4-Wk PoC: Establishing a hybrid cloud foundation infrastructure in Azure requires planning, design, and implementation of core areas. Diaxion can cover all or some of the aspects needed to consume, operate, and manage Azure services.

Cloud Team Managed Services - 4-Wk Implementation

Cloud Team Managed Services: 4-Wk Implementation: This implementation is a 24x7 managed service that includes monitoring, support, security and governance, cost optimization, and everything that is needed for a modern cloud platform.

Cloud Transformation Plan - 3-4 weeks workshop

Cloud Transformation Plan: 3-4 weeks workshop: Once a discovery phase is complete, this process will use the available information to produce a transformation plan describing how the migration or transition to the public cloud can be achieved.

Customer Care Chatbot - 2-Day Proof of Concept

Customer Care Chatbot: 2-Day Proof of Concept: Redefine your customer experience with the Customer Care Chatbot, powered by Azure Bot Services and Azure Cognitive Services. Your 24/7 smart assistant can provide instant and precise information to your customers.

Customer Care Chatbot - 2-Hr Briefing

Customer Care Chatbot: 2-Hr Briefing: This briefing session introduces The Cloud Factory's Customer Care Chatbot, powered by Azure Bot Services and Azure Cognitive Services.

Deploying SCOM - 3-Wk Implementation

Deploying SCOM: 3-Wk Implementation: Do you need help with Microsoft System Center Operations Manager (SCOM) deployment? Infront Consulting Group can coach you on the process and offer remote delivery for deploying SCOM deliverables.

Digital Platform for Contracting - 4-Hr Assessment

Digital Platform for Contracting: 4-Hr Assessment: Vana’s contracting solution for government streamlines the lifecycle from contract initiation to records management, providing a cloud or on-premises platform for agile acquisition.

Ecommerce Sizing Solution - 10-day Implementation

Ecommerce Sizing Solution: 10-day Implementation: Online retailers lose business because shoppers have difficulty visualizing items. Tangiblee helps online visitors better understand product size and fit, leveraging retailers' existing product imagery and data.

Free Azure Optimization Assessment - Australia

Free Azure Optimization Assessment - Australia: In this free assessment, our Azure experts review (no tools) every aspect of your tenant and produce recommendations to improve performance, lower costs, add availability, and improve security.

Free Azure Optimization Assessment - Canada (EN)

Free Azure Optimization Assessment - Canada (EN): In this free assessment, our Azure experts review (no tools) every aspect of your tenant and produce recommendations to improve performance, lower costs, add availability, and improve security.

Free Azure Optimization Assessment - U.K.

Free Azure Optimization Assessment - U.K.: In this free assessment, our Azure experts review (no tools) every aspect of your tenant and produce recommendations to improve performance, lower costs, add availability, and improve security.

Free Cloud Strategy Foundations - 4-Hour Workshop

Free Cloud Strategy Foundations: 4-Hour Workshop: Cloud adoption and operation can be complex, but we at CAPSiDE can help you define your cloud strategy, whether you are just starting out, on your way, or fully on board with 16 years of experience.

GDPR of On-Premise Environment - 2-Wk Assessment

GDPR of On-Premise Environment: 2-Wk Assessment: Improve your GDPR compliance in two weeks with a data-driven assessment of your IT infrastructure and your cybersecurity and data protection solutions.

Initial 2-Hr Assessment - Software Development

Initial 2-Hr Assessment: Software Development: Digital Mettle has developed solutions for companies of all sizes since 2001. If you have an idea for new software, business process automation, or online customer service, this is the place to start, free of charge.

IoT and Computer Vision with Azure - 4-Day Workshop

IoT and Computer Vision with Azure: 4-Day Workshop: Gain hands-on experience with Azure IoT services and Raspberry Pi by creating a simple home security solution in this four-day IoT workshop.

Kubernetes on Azure 2-Day Workshop

Kubernetes on Azure 2-Day Workshop: Kubernetes is quickly becoming the container orchestration platform of choice for organizations deploying apps in the cloud. Ramp up your development and operations team members with this deeply technical bootcamp.

Lift and Shift/Resource Mgr - 1-Day Virtual Workshop

Lift and Shift/Resource Mgr: 1-Day Virtual Workshop: This workshop details how to migrate an on-premises procurement system into Azure, map dependencies onto Azure Infrastructure-as-a-Service VMs, and provide end-state design and the steps to get there.

Managed Cost & Security - 2-Wk Assessment

Managed Cost & Security: 2-Wk Assessment: Our managed service ensures you stay in control of all your Azure resources for cost, security, and governance and regulatory compliance (GDPR, PCI, ISO).

Microsoft Azure App Hosting Design 1-Wk Assessment

Microsoft Azure App Hosting Design 1-Wk Assessment: In this assessment, Sikich will identify the client’s business requirements and create an Azure design that will include a structural design, monthly estimated cost, and estimated installation services.

Migrate Local MS SQL Database To Azure - 4-Wk

Migrate Local MS SQL Database To Azure: 4-Wk: Akvelon’s migration services will migrate your local Microsoft SQL Server workload to Azure SQL Database or an Azure virtual machine in a quick, safe, and cost-effective manner.

Migrate To Azure AD - 3-Wk Implementation

Migrate To Azure AD: 3-Wk Implementation: Akvelon’s migration services will quickly, smoothly, and safely migrate your local Active Directory to Azure Cloud Services.

Migrate Your Websites - 5-Wk Implementation

Migrate Your Websites: 5-Wk Implementation: Akvelon’s migration services will migrate and deploy your web applications to Azure Cloud Services, allowing you to take advantage of its hosting services and scalability.

Orchard CMS consulting 1-week implementation

Orchard CMS consulting 1-week implementation: Get in touch with us for any kind of Orchard consulting: custom development, training, DevOps, or troubleshooting. We focus on web development, training, hosting, and consulting with open Microsoft technologies.

Permissioned Private Blockchain - 4-Wk PoC

Permissioned Private Blockchain - 4-Wk PoC: Understanding blockchain's technical capabilities is a crucial part of solving a business problem. We help you evaluate, ideate, and build blockchain solutions from the blockchain framework to the UI of the product.

Platform Modernization 5-Day Workshop

Platform Modernization 5-Day Workshop: This five-day workshop is for technical and business leaders and is held on-site at the client’s facility.

Rapid Commerce - A Microservices Commerce Platform

Rapid Commerce: A Microservices Commerce Platform: Quickly shift your retail business into high gear with a cloud-based, microservices accelerator and seven-week pilot by Publicis.Sapient.

Tally on Azure - 5-Day Implementation

Tally on Azure: 5-Day Implementation: Run your on-premises Tally accounting and ERP app from Azure for secure access without server hardware hassles. This lift-and-shift implementation is for technical and business leaders, and is delivered remotely.

TFS to VSTS Migration

TFS to VSTS Migration: Set your DevOps team up for success. We’ll help you migrate from Team Foundation Server (TFS) to Visual Studio Team Services (VSTS) in the cloud.

Zero Dollar Down SAP Migration 1-Day Assessment

Zero Dollar Down SAP Migration 1-Day Assessment: In this assessment, we analyze your cloud infrastructure needs, SAP software and data upgrade requirements, and regulatory compliance. We then help you choose the right deployment model in Azure.

Cooling down storage costs in the healthcare AI blueprint

Artificial Intelligence (AI) and Machine Learning (ML) are transforming healthcare, from streamlining operations to aiding in clinical diagnosis. Yet healthcare organizations are often challenged to begin an AI/ML journey due to a lack of experience or the high cost.

The Azure Healthcare AI blueprint installs a HIPAA- and HITRUST-compliant environment in Azure for managing and running healthcare AI experiments. It provides a quick start to your AI/ML efforts and can get technical staff proficient with a reference implementation quickly and at little cost.

Since it is a reference implementation, you must consider the ongoing costs of maintaining the blueprint infrastructure in production. One place to look for easy savings is storage. In this entry, we’ll discuss features of Azure Blob Storage and practices to lower its cost.

The case for more storage: AI and cognitive services

The blueprint is designed to ease the learning and implementation of AI/ML in a healthcare organization. A “patient length of stay” experiment is included, which uses .csv files that take up little room. But consider other data that could be used for machine learning, such as x-ray, MRI, and other radiological images. And as AI services become mainstream, you could also be storing video and audio files, because cognitive services can transcribe audio or tag photographs. In sum, as the capabilities of AI grow, the need for blob storage can expand dramatically.

Required storage

Often, a workload in Azure starts by saving a file into blob storage, because it is important to keep a copy of the data as the “source of truth” for long periods of time, up to several years, to comply with retention regulations or policies. Storing the original data means any operations performed on it are repeatable.

When data is placed into blob storage, it is considered “hot” data, available anytime you need it; it is retrieved immediately upon request.

Once data in blob storage has been used, say for an ML Studio experiment, it may not be necessary to hold it in hot storage. In cases where the data needs to be kept but is not accessed very often, there are ways to lower the costs.

Using tiered Blob Storage

Blob Storage has three data tiers, each based on how often the data is retrieved and each with a different cost. A short code sketch after the list below shows how a blob can be moved between tiers programmatically.

  • If data is accessed frequently, it is considered to be in hot storage. This is the most accessible, yet expensive, Blob Storage option.
  • If the data is accessed less frequently and is stored for at least 30 days, it is a prime candidate for cool storage, which still offers quick and easy access to the data, like hot storage, but at a lower price for data that is not accessed often.
  • The archive storage tier is optimized for scenarios in which the data is held for at least 180 days and is not expected to be immediately available upon query. It can take several hours to retrieve data from archive storage.
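
Here is a minimal sketch of demoting a blob to the cool tier, assuming the azure-storage-blob v12 Python SDK; the connection string, container, and blob names are placeholders:

# Minimal sketch: demote an infrequently accessed blob to the Cool tier.
# Assumes the azure-storage-blob v12 SDK; all names below are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string('<storage-connection-string>')
blob = service.get_blob_client(container='experiment-data', blob='los-training.csv')

# Block blobs can move between 'Hot', 'Cool', and 'Archive' on demand;
# rehydrating from Archive back to Hot or Cool can take hours.
blob.set_standard_blob_tier('Cool')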

Data may also need to be purged after a given time interval. For example, healthcare diagnostic data may have to be kept for a mandated period of time; when that time runs out, the data should be removed from storage.

Managing data between tiers

Different data types in healthcare need to be stored for different lengths of time and may have different lifecycles as they move from hot to cool or archive storage. Azure Blob Storage lifecycle management offers a rule-based policy which can be used to transition data to the most appropriate access tier and to expire data at the end of its lifecycle.

Through lifecycle management, rules may be created to perform several actions on blobs individually; a sketch of such a rule follows the list below.

  • Transition blobs to a cooler storage tier
  • Optimize for performance and cost
  • Delete blobs at the end of their lifecycles
  • Define rules to be executed at set intervals at the storage account level
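
As a hedged sketch, a lifecycle rule that cools, archives, and eventually deletes diagnostic blobs could look like the following policy document. The prefix and retention periods are placeholders; the saved JSON can then be applied with the Azure CLI's az storage account management-policy create command.

# Sketch of a lifecycle management policy (placeholder names and periods).
# Once saved as JSON, it can be applied with:
#   az storage account management-policy create \
#       --account-name <account> --resource-group <rg> --policy @policy.json
import json

policy = {
    "rules": [{
        "name": "age-off-diagnostic-data",
        "enabled": True,
        "type": "Lifecycle",
        "definition": {
            "filters": {"blobTypes": ["blockBlob"], "prefixMatch": ["diagnostics/"]},
            "actions": {
                "baseBlob": {
                    "tierToCool":    {"daysAfterModificationGreaterThan": 30},
                    "tierToArchive": {"daysAfterModificationGreaterThan": 180},
                    # e.g., a hypothetical mandated seven-year retention period
                    "delete":        {"daysAfterModificationGreaterThan": 2555}
                }
            }
        }
    }]
}

with open('policy.json', 'w') as f:
    json.dump(policy, f, indent=2)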

These capabilities can save the healthcare organization money when using blob storage, as the blueprint does. When building a larger solution on the blueprint, data storage costs may rise, requiring attention to the access tiers in use.

Recommended next steps

In addition to the Azure portal, there are several other options for programmatically accessing and managing Blob Storage, including PowerShell and CLI tools; the .NET, Java, Python, and Node.js client libraries all support blob-level tiering and archive storage.

To understand the data ingestion and storage model of the Healthcare AI blueprint, install it and consider the Blob Storage pricing models, keeping in mind that patient length of stay is affected by admissions and discharges on any given day.

Find even more system optimizations and ensure a successful install of the Healthcare AI blueprint by reading the Implementing the Azure blueprint for AI article. It takes you through the blueprint and highlights areas needing special attention, such as which Blob Storage tier to use.

Collaboration

Your comments and recommendations are welcome below. I regularly post about technology in healthcare. Reach out and connect with me on LinkedIn or Twitter.

World-class PyTorch support on Azure

Today we are excited to strengthen our commitment to supporting PyTorch as a first-class framework on Azure, with exciting new capabilities in our Azure Machine Learning public preview refresh. In addition, our PyTorch support extends deeply across many of our AI Platform services and tooling, which we will highlight below.

During the past two years since PyTorch's first release in October 2016, we've witnessed the rapid and organic adoption of the deep learning framework among academia, industry, and the AI community at large. While PyTorch's Python-first integration and imperative style have long made the framework a hit among researchers, the latest PyTorch 1.0 release brings the production-level readiness and scalability needed to make it a true end-to-end deep learning platform, from prototyping to production.

Four ways to use PyTorch on Azure

Azure Machine Learning service

Azure Machine Learning (Azure ML) service is a cloud-based service that enables data scientists to carry out end-to-end machine learning workflows, from data preparation and training to model management and deployment. Using the service's rich Python SDK, you can train, hyperparameter tune, and deploy your PyTorch models with ease from any Python development environment, such as Jupyter notebooks or code editors. With Azure ML's deep learning training features, you can seamlessly move from training PyTorch models on your local machine to scaling out to the Azure cloud.

You can install the Python SDK with this simple line.

pip install azureml-sdk

To use the Azure ML service, you will need an Azure account and access to an Azure subscription. The first step is to create an Azure Machine Learning workspace, a centralized resource that manages your other Azure resources and training runs.

from azureml.core import Workspace

# Create the workspace that tracks your resources and training runs
ws = Workspace.create(name='my-workspace',
                      subscription_id='<azure-subscription-id>',
                      resource_group='my-resource-group')

To take full advantage of PyTorch, you will need access to at least one GPU for training, and a multi-node cluster for more complex models and larger datasets. Using the Python SDK, you can easily take advantage of Azure compute for single-node and distributed PyTorch training. The code snippet below will create an autoscaling, GPU-enabled Azure Batch AI cluster.

from azureml.core.compute import BatchAiCompute, ComputeTarget

# Provision an autoscaling cluster of STANDARD_NC6 GPU VMs that scales to zero when idle
provisioning_config = BatchAiCompute.provisioning_configuration(vm_size='STANDARD_NC6',
                                                                autoscale_enabled=True,
                                                                cluster_min_nodes=0,
                                                                cluster_max_nodes=4)
compute_target = ComputeTarget.create(ws, 'gpu-cluster', provisioning_config)

From here, all you need is your training script. The following example will create an Azure ML experiment to track your runs and execute a distributed 4-node PyTorch training run.

from azureml.core import Experiment
from azureml.train.dnn import PyTorch

experiment = Experiment(ws, 'my-pytorch-experiment')

# Distributed 4-node PyTorch training run using the MPI backend
pt_estimator = PyTorch(source_directory='./my-training-files',
                       compute_target=compute_target,
                       entry_script='train.py',
                       node_count=4,
                       process_count_per_node=1,
                       distributed_backend='mpi',
                       use_gpu=True)

experiment.submit(pt_estimator)

To further optimize your model's performance, Azure ML provides hyperparameter tuning capabilities to sweep through various hyperparameter combinations. Azure ML will schedule your training jobs, terminate poor-performing runs early, and provide rich visualizations of the tuning process.

from azureml.train.hyperdrive import (BanditPolicy, HyperDriveRunConfig,
                                      PrimaryMetricGoal, RandomParameterSampling,
                                      loguniform, uniform)

# parameter sampling: the search space is a dictionary of hyperparameter expressions
param_sampling = RandomParameterSampling({
    "learning_rate": loguniform(-10, -3),
    "momentum": uniform(0.9, 0.99)
})

# early termination policy: stop runs trailing the best run by more than 15% slack
policy = BanditPolicy(slack_factor=0.15, evaluation_interval=1)

# hyperparameter tuning configuration: maximize the 'accuracy' metric
hd_run_config = HyperDriveRunConfig(estimator=pt_estimator,
                                    hyperparameter_sampling=param_sampling,
                                    policy=policy,
                                    primary_metric_name='accuracy',
                                    primary_metric_goal=PrimaryMetricGoal.MAXIMIZE,
                                    max_total_runs=100,
                                    max_concurrent_runs=10)

experiment.submit(hd_run_config)

Finally, data scientists and engineers use the Python SDK to deploy their trained PyTorch models to Azure Container Instances or Azure Kubernetes Service.
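
As an illustrative sketch of that deployment step (assuming a recent azureml-sdk; the scoring script, environment file, and resource names are placeholders rather than artifacts from this article):

# Illustrative sketch only: deploy a trained PyTorch model to Azure Container
# Instances. Assumes a recent azureml-sdk; score.py, env.yml, and all names
# below are placeholders.
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AciWebservice

model = Model.register(workspace=ws,
                       model_path='outputs/model.pt',
                       model_name='pytorch-model')

inference_config = InferenceConfig(runtime='python',
                                   entry_script='score.py',   # loads the model and handles requests
                                   conda_file='env.yml')

aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=2)
service = Model.deploy(ws, 'pytorch-service', [model], inference_config, aci_config)
service.wait_for_deployment(show_output=True)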

You can check out a comprehensive overview of Azure Machine Learning's full suite of offerings as well as access complete tutorials on training and deploying PyTorch models in Azure ML.

Data Science Virtual Machine

Azure also provides the Data Science Virtual Machine (DSVM), a customized VM dedicated to data science experimentation. The DSVM comes preconfigured and preinstalled with a comprehensive set of popular data science and deep learning tools, including PyTorch. The DSVM is a great option if you want a frictionless development experience when building models with PyTorch. To use the full features of PyTorch, you can use a GPU-based DSVM, which comes preinstalled with the necessary GPU drivers and the GPU version of PyTorch. The DSVM is preinstalled with the latest stable PyTorch 0.4.1 release and can easily be upgraded to the PyTorch 1.0 preview.

Azure Notebooks

Azure Notebooks is a free, cloud-hosted Jupyter Notebooks solution that you can use for interactive coding in your browser. We preinstalled PyTorch on the Azure Notebooks container, so you can start experimenting with PyTorch without having to install the framework or run your own notebook server locally. In addition, we provide a maintained library of the official, up-to-date PyTorch tutorials on Azure Notebooks. Learning or getting started with PyTorch is as easy as creating your Azure account and cloning the tutorial notebooks into your own library.

Visual Studio Code Tools for AI

Visual Studio Code (VS Code) is a popular and lightweight source code editor. VS Code Tools for AI is a cross-platform extension that provides deep learning and AI experimentation features for data scientists and developers using the IDE. Tools for AI is tightly integrated with the Azure Machine Learning service, so you can submit PyTorch jobs to Azure compute, track experiment runs, and deploy your trained models all from within VS Code.

The Tools for AI extension also provides rich syntax highlighting, automatic code completion via IntelliSense, and built-in documentation search for PyTorch APIs.

Microsoft contributions to PyTorch

Finally, we are excited to join the amazing community of PyTorch developers in contributing innovations and enhancements to the PyTorch platform. Some of our immediate planned contributions are improvements to PyTorch data loading and processing, including better performance, data reading support for Hidden Markov Model Toolkit (HTK)-defined formats for speech datasets, and a data loader for Azure Blob Storage. In addition, we are working to achieve complete parity for PyTorch Windows support and full ONNX coverage adhering to the ONNX standard. We will also work closely with Microsoft Research (MSR) on incorporating MSR innovations into PyTorch.

At Microsoft we are dedicated to embracing and contributing to the advancements of such open-source AI technologies. Just a year ago we began our collaboration with Facebook on the Open Neural Network Exchange (ONNX), which promotes framework interoperability and shared optimization of neural networks through the ONNX model representation standard. With our commitment to PyTorch on Azure, we will continue to expand our ecosystem of AI offerings to enable richer development in research and industry.

IoT solutions for manufacturing: build or buy?

If you are a manufacturer taking your first steps toward IoT, and you’re overwhelmed by the plethora of vendors and platforms in the IoT space, you are not alone. IoT is still a new space, with many moving parts and products. This makes it hard for organizations to know exactly where and how to get started. In this blog, I will provide a simplified overview and next steps, based on the conversations I have been having with many manufacturing organizations.

Components of an IoT solution

When it comes to deciding whether to build or buy your IoT solution it is important, of course, to understand exactly what you are building or buying. To that end, it helps to identify the main components of an IoT solution stack (Figure 1). From bottom to top:

Figure 1: Building Blocks of an IoT Solution

1. Cloud platform: a set of general-purpose PaaS services used by developers to develop cloud-based solutions. These services include messaging, storage, compute, security, and more. Cloud platforms (such as Microsoft Azure) also include analytics services and IoT services.

2. IoT platform: A set of IoT-specific PaaS and SaaS services and development tools that facilitate the development of IoT solutions. IoT platforms are often developed on top of a general-purpose cloud platform, which takes care of scalability, reliability, performance, and security. They offer additional value over general-purpose cloud platforms that can accelerate your time-to-value, often because the companies that develop them have extensive experience in manufacturing.

3. IoT solution: An end-to-end software system that spans from the physical connection to the devices, through data ingestion, storage, transformation, and analytics, to the user interface.

4. IoT applications: The user-facing components of the IoT solution. Users interact with them to visualize and manipulate IoT data, extract insights, and make decisions or execute actions based on those insights. There are usually multiple IoT applications for a given IoT solution, each tailored to specific users and use cases, and they can come in a variety of shapes and forms.

For example, Figure 2 (below) is a browser-based dashboard that shows the health of a customer’s engines in all its airplanes, highlighting any alerts so they can zoom in to the specific engine to see what’s going on.

Figure 2

Figure 3 is a mobile app that shows the expected remaining life of a tool in a machine.

Figure 3

Figure 4 shows an augmented reality app that overlays information in a worksite to help service technicians perform maintenance.

Figure 4

(Examples of IoT applications are courtesy of Rolls Royce, Sandvik Coromant, and Thyssenkrupp Elevator.)

Build vs. buy considerations

When companies set out to develop their IoT solutions, they have two options:

  1. Buy an IoT platform from a vendor to develop their solutions on top of it.
  2. Build the solution directly on top of a cloud platform, leveraging its IoT and analytics services.

In both cases above, companies may choose to build their IoT solution alone, or with help from consulting services or a systems integrator.

Notice that we do not mention the option of buying an IoT solution. This is because it is not realistic to expect that a software vendor will deliver a solution that will be used by its customers as-is, right off the shelf. Each company has its unique requirements, factory environment, and use cases. Therefore, most customers will build their own solutions or extend solution accelerators and templates offered by IoT platforms.

Also, we do not discuss the option of building your own cloud platform. Most companies will leverage an existing cloud platform from a cloud vendor like Microsoft rather than developing their own. Building a cloud platform is likely to be an impractical proposition for most.

So, the options for those developing an IoT solution effectively boil down to the two listed above and reproduced below, along with their pros and cons:

Option 1: Buy an IoT platform from a vendor
  Pros:
  • Faster time-to-value
  Cons:
  • More expensive
  • May offer limited or no choice of underlying cloud platform

Option 2: Build on a cloud platform
  Pros:
  • More flexibility
  • More control
  • Less expensive
  • May leverage your company’s existing cloud subscription
  Cons:
  • Requires cloud development skills

Table 1: Pros and Cons of Build vs. Buy Options for IoT Solutions

Conclusion

There is much more to discuss about build vs. buy choices when it comes to IoT solutions. For more help and resources, including a solution guide, podcasts, webinars, and partner and customer highlights, see the Azure for manufacturing use cases page.


Enabling real-time data warehousing with Azure SQL Data Warehouse

Gaining insights rapidly from data is critical to competitiveness in today’s business world. Azure SQL Data Warehouse (SQL DW), Microsoft’s fully managed analytics platform, leverages Massively Parallel Processing (MPP) to run complex interactive SQL queries at every level of scale.

Users today expect data within minutes, a departure from traditional analytics systems, which operated on data latencies of a day or more. With the requirement for faster data, users need ways of moving data from source systems into their analytical stores in a simple, quick, and transparent fashion. Delivering on modern analytics strategies requires that users act on current information, which means enabling the continuous movement of enterprise data, from on-premises to cloud and everything in between.

The SQL Data Warehouse team is happy to announce that Striim now fully supports SQL Data Warehouse as a target for Striim for Azure. Striim enables continuous, non-intrusive, performant ingestion of all your enterprise data from a variety of sources in real time. This means that users can build intelligent pipelines for change data capture from sources such as Oracle Exadata straight into SQL Data Warehouse. Striim can also be used to move fast-moving data landing in your data lake into SQL Data Warehouse, with advanced functionality such as on-the-fly transformation and model-based scoring with Azure Databricks.

"Enterprises adopting cloud-based analytics need to ensure reliable, real-time and continuous data delivery from on-prem and cloud-based data sources to reduce decision latencies inherent in batch based analytics. Striim's solution for SQL Data Warehouse is offered in the Azure marketplace, and can help our customers quickly ingest, transform, and mask real time data from transactional systems or Kafka into SQL Data Warehouse to support both operational and analytics workloads". 

– Alok Pareek, Founder and EVP of Products for Striim.

By performing in-line transformations, including denormalization, before delivering data to Azure SQL Data Warehouse, Striim reduces on-premises ETL workload as well as data latency. Striim enables fast data loading to Azure SQL DW through optimized interfaces such as streaming (JDBC) or batching (PolyBase). Azure customers can store the data in the right format and provide full context for any downstream operations, such as reporting and analytical applications.

Next steps

To learn more about how you can build a modern data warehouse using Azure SQL Data Warehouse and Striim, watch this video, schedule a demo with a Striim technologist, or get started now on the Azure Marketplace.

Learn more about SQL DW and stay up-to-date with the latest news by following us on Twitter @AzureSQLDW.

Eight use cases for machine learning in insurance

Insurance companies that sell life, health, and property and casualty insurance are using machine learning (ML) to drive improvements in customer service, fraud detection, and operational efficiency. For example, the Azure cloud is helping insurance brands save time and effort using machine learning to assess damage in accidents, identify anomalies in billing, and more.

Here are some common use cases for ML in insurance, along with resources for getting started with ML in Azure.

Eight ML use cases to improve service, optimization, automation, and scale

  1. Lapse management: Identifies policies that are likely to lapse, and how to approach the insured about maintaining the policy.
  2. Recommendation engine: Given similar customers, discovers where individual insureds may have too much, or too little, insurance, then proactively helps them get the right insurance for their current situation.
  3. Assessor assistant: Once a car has been towed to a body shop, uses computer vision to help the assessor identify issues that need to be fixed. This improves accuracy, speeds up assessment, and keeps the customer informed about repairs.
  4. Property analysis: Given images of a property, identifies structures on the property and any condition issues. Insurers can proactively help customers schedule repairs by identifying issues in their roofs, or suggest other coverage when new structures, like a swimming pool, are installed.
  5. Fraud detection: Identifies claims which are potentially fraudulent.
  6. Personalized offers: Improves the customer experience by offering relevant information about the coverage the insured may need based on life events, such as the birth of a child or the purchase of a home or car.
  7. Experience studies: Uses unsupervised machine learning to discover predictors in claims activity. This information can help set assumptions and feed into activities such as pricing models, risk analyses, and other actuarial analyses.
  8. Scaled training: Trains your models on Azure using GPUs or thousands of CPU cores.

All of these use cases can be addressed with machine learning; the sketch below illustrates one of them.
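
For instance, fraud detection (use case 5) is often framed as anomaly detection. The following is a purely illustrative sketch using scikit-learn's IsolationForest on made-up claim features, not real insurer data:

# Purely illustrative: flag anomalous claims with scikit-learn's IsolationForest.
# The features and values below are invented for the example.
import numpy as np
from sklearn.ensemble import IsolationForest

# hypothetical features per claim: [claim_amount, days_since_policy_start, prior_claims]
claims = np.array([
    [ 1200, 400, 0],
    [  900, 380, 1],
    [ 1100, 420, 0],
    [48000,  12, 6],   # unusual: very large claim shortly after policy start
])

model = IsolationForest(contamination=0.25, random_state=0).fit(claims)
print(model.predict(claims))  # -1 flags a claim for human review, 1 passes it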

Machine learning on Azure

Machine learning enables computers to learn from data rather than through explicit programming. Insurers use artificial intelligence (AI) applications to intelligently process and act on data, achieving more through increased speed and efficiency. Get started with these resources:

Recommended next steps

Additional resources

Cloud Scale Analytics meets Office 365 data – empowered by Azure Data Factory

Office 365 holds a wealth of information about how people work and how they interact and collaborate with each other. This valuable asset enables intelligent applications to derive value and optimize organizational productivity. Today, application developers use the Microsoft Graph API to access Office 365 in a transactional way. This approach, however, is not efficient if you need to analyze a large number of Office artifacts across a long time horizon. Further, Office 365 data is isolated from other business data and systems, leading to data silos and untapped opportunity for additional insights.

Azure offers a rich set of hyperscale analytics services with enterprise-grade security, available in datacenters worldwide. By bringing Office 365 data into Azure, developers can harness the full power of Azure to build highly scalable and secure applications against the combination of Office 365 data and other business data.

This week at Ignite we announced the public preview of Microsoft Graph data connect, which enables secure, governed, and scalable access to Office 365 data in Azure. With this offering, for the very first time, all your data (organizational, customer, transactional, external) can come together for innovative analytics and insights, in a way that was not possible before.

Integrate Office 365 data at scale in Azure using Azure Data Factory

Azure Data Factory (ADF) is a managed data integration service that allows you to bring together diverse data sources in Azure and build operationalized ETL flows. With this added ability to bulk ingest Office 365 data, ADF’s collection of 70+ on-prem and cloud data connectors just got richer and more comprehensive.

Using visual tools in ADF, you can easily author a pipeline that does a one-time or scheduled ingestion of the Microsoft 365 dataset of interest. As of today, the available datasets include: email messages, calendar events, personal contacts, mail folders, and mailbox settings. More types of data from Microsoft 365 will be added over time. For each dataset, you can choose which columns you wish to ingest, which groups of users to include, and set a time-based row filter.
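
If you prefer code over the visual tools, the same kind of copy pipeline can be created with the ADF .NET management SDK. The sketch below is an illustration, not a definitive implementation: it assumes the data factory, the Office 365 source dataset, and the blob sink dataset (and their linked services) already exist, and every resource name is a hypothetical placeholder.

using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactory;
using Microsoft.Azure.Management.DataFactory.Models;
using Microsoft.Rest;

public static class GraphDataConnectPipeline
{
    public static void Create(ServiceClientCredentials credentials)
    {
        var client = new DataFactoryManagementClient(credentials)
        {
            SubscriptionId = "<subscription-id>" // placeholder
        };

        // Copy activity reading from an Office 365 source dataset
        // and writing to an Azure Blob sink dataset.
        var copyMessages = new CopyActivity
        {
            Name = "CopyEmailMessages",
            Inputs = new List<DatasetReference>
            {
                new DatasetReference { ReferenceName = "Office365MessageDataset" }
            },
            Outputs = new List<DatasetReference>
            {
                new DatasetReference { ReferenceName = "BlobMessageDataset" }
            },
            Source = new Office365Source(),
            Sink = new BlobSink()
        };

        client.Pipelines.CreateOrUpdate("<resource-group>", "<data-factory-name>",
            "CopyOffice365Pipeline",
            new PipelineResource { Activities = new List<Activity> { copyMessages } });
    }
}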

Office 365 data access through ADF offers many additional benefits including:

  • Once data has landed in Azure, you can use Azure Databricks to prepare, transform, and further enrich the data with machine learning. ADF provides integration with Azure Databricks to execute Databricks Notebooks, Jars, and Python scripts.
  • You can bring in additional data sources to integrate with the Office 365 dataset, for example you can combine customer sales and marketing data from Salesforce, SAP, and Dynamics 365 together with email exchanges and meeting events in Office 365 in order to keep track of your sales pipeline and predict customer propensity.
  • You can publish curated and analyzed results into Azure Cosmos DB for consumption in geo-distributed applications. ADF provides efficient data loading into Azure Cosmos DB through the bulk executor library.
  • ADF uses Service Principals for secure service-to-service authentication. Manage the key securely in Azure Key Vault (AKV); ADF integrates with AKV to retrieve the key at runtime.
  • ADF provides iterative development and debugging, including the ability to export to/import from ARM templates.
  • ADF enables CI/CD through integration with VSTS Git and GitHub.

View and approve data access requests using Privileged Access Management

Office 365 privileged access management goes beyond traditional access control capabilities by enabling access granularity for specific data pipelines. It provides just-enough access for developers, scoped down to the specific properties in the dataset they need. The customer's Office 365 administrators can view all data users along with what access they have. Administrators may also exclude certain sensitive users.

If you develop with Azure Managed Applications, the Office 365 administrators will also be able to see what policies your application complies with.

Package and deploy your application using Azure managed applications

Azure managed applications enable you to offer cloud solutions that are easy for customers to deploy and operate, and easy for you to manage. You can publish to your organization's service catalog for internal deployment or to the Azure Marketplace to be sold to external customers. The customer will not have access to the code or operations of your application; those remain managed by you. Coming soon, you will also be able to forgo standing access to the application as a security assurance to your sensitive customers.

Finally, you may opt in to certain policies being enforced by Azure managed applications over your application. These policies are continuously verified and represented to the customer’s Office 365 administrator. If a policy is violated, the Data Factory pipeline fails.

ISV Testimonial

For the last few months, we have been working with ISVs who find the value propositions offered by Microsoft Graph data connect really appealing. A great example of these early adopters is Harmon.ie.

Harmon.ie is a long-standing Microsoft partner that focuses on delivering people-centric applications for Office 365 and SharePoint. One of their solutions, Harmon.ie 10, is a next-generation application that allows information workers to view key topics in a graph view and easily navigate documents and calendar events by topic. Below is a quote from Harmon.ie on how they have benefited from Microsoft Graph data connect:

“We have been working on harmon.ie 10 for a couple of years with a public cloud provider and the two main issues were:

  • How to retrieve all of an organization's email - it’s a massive amount of data and the traditional API methods didn’t scale, and
  • How to protect our customers’ data privacy - machine learning on all the organization’s emails in the public cloud would be unsavory.

Migrating our code into an Azure Data Factory pipeline allowed us to simplify our source code and to provide our solution as a Managed App that runs in the customer Azure subscription without pulling the data away from the customer trusted data boundaries.

For us, this was a game changer.”   Yehonathan Sharvit, Vice President Research and Development, Harmon.ie

Get Started Today!

Read more about the Office 365 connector in ADF and follow this tutorial to build pipelines against Office 365 data using Azure Data Factory.

We cannot wait to see what you build!  If you have any feature requests or want to provide feedback, please visit the Azure Data Factory forum.

.NET Core October 2018 Update



Today, we are releasing the .NET Core October 2018 Update. This update includes .NET Core 2.1.5 and .NET Core SDK 2.1.403 and contains important reliability fixes.

Getting the Update

The latest .NET Core updates are available on the .NET Core download page. This update is also included in the Visual Studio 15.8.6 update, which is also releasing today.

See the .NET Core 2.1.5 release notes for details on the release including a detailed commit list.

Docker Images

.NET Docker images have been updated for today’s release. The following repos have been updated.

Note: Look at the “Tags” view in each repository to see the updated Docker image tags.

Note: You must re-pull base images in order to get updates. The Docker client does not pull updates automatically.
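
For example, if your images build on the 2.1 runtime image, you might re-pull it explicitly and rebuild without the cache (the image tag and application name below are illustrative):

    docker pull microsoft/dotnet:2.1-runtime
    docker build --no-cache -t myapp .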

Lifecycle Updates

The end-of-life schedule for .NET Core 2.0 was updated and announced in June, and that day has arrived. .NET Core 2.0 was released August 14, 2017 and really began to open the path to the capabilities envisioned for .NET Core. Instructions for upgrading from .NET Core 2.0 to .NET Core 2.1 can be found in the following documents:

.NET Core 2.1 has been declared the long-term support (LTS) release. We recommend that you make .NET Core 2.1 your new standard for .NET Core development.

Azure App Services

Deployment of .NET Core 2.1.5 to Azure App Services has begun and the West Central US region will be live this morning. Remaining regions will be updated over the next few days and deployment progress can be tracked in this Azure App Service announcement.

App Services and end of life .NET Core versions

We are working through a maintenance plan for the versions of .NET Core and the .NET Core SDK that will be available on Azure App Services. Currently, the available versions include some which are quite old and need to be removed. The essential idea is to keep the latest patch version of each in-support release channel (e.g., 1.0, 2.1). Because the SDK is capable of building applications for any version of the runtime, only the latest SDK will be retained.
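
If a build depends on a specific SDK version, you can pin it explicitly with a global.json file at the root of your repository; the version shown is the SDK shipped with this update:

    {
      "sdk": {
        "version": "2.1.403"
      }
    }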

.NET Core 2.0 reached end of life on October 1, 2018 and is no longer eligible for support or updates which means that it should be removed from App Services. However, we understand that many applications on App Services use 2.0 and removing it from the service too quickly would be disruptive. To give ample opportunity to migrate applications, we are going to ‘attach’ the 2.0 App Services maintenance to the .NET Core 1.0 and 1.1 end of life schedule, which concludes June 27, 2019 per the .NET Core Support Policy. After that date, .NET Core 1.0, 1.1 and 2.0 will be removed from Azure App Services.

Previous .NET Core Updates

The last few .NET Core updates follow:

September 2018 Update
August 2018 Update
July 2018 Update
June 2018 Update
May 2018 Update

Blazor 0.6.0 experimental release now available


Blazor 0.6.0 is now available! This release includes new features for authoring templated components and enables using server-side Blazor with the Azure SignalR Service. We're also excited to announce our plans to ship the server-side Blazor model as Razor Components in .NET Core 3.0!

Here's what's new in the Blazor 0.6.0 release:

  • Templated components
    • Define components with one or more template parameters
    • Specify template arguments using child elements
    • Generic typed components with type inference
    • Razor templates
  • Refactored server-side Blazor startup code to support the Azure SignalR Service

A full list of the changes in this release can be found in the Blazor 0.6.0 release notes.

Get Blazor 0.6.0

Install the following:

  1. .NET Core 2.1 SDK (2.1.402 or later).
  2. Visual Studio 2017 (15.8 or later) with the ASP.NET and web development workload selected.
  3. The latest Blazor Language Services extension from the Visual Studio Marketplace.
  4. The Blazor templates on the command-line:

    dotnet new -i Microsoft.AspNetCore.Blazor.Templates
    

You can find getting started instructions, docs, and tutorials for Blazor at https://blazor.net.

Upgrade an existing project to Blazor 0.6.0

To upgrade a Blazor 0.5.x project to 0.6.0:

  • Install the prerequisites listed above.
  • Update the Blazor package and .NET CLI tool references to 0.6.0. The upgraded Blazor project file should look like this:

    <Project Sdk="Microsoft.NET.Sdk.Web">
    
    <PropertyGroup>
        <TargetFramework>netstandard2.0</TargetFramework>
        <RunCommand>dotnet</RunCommand>
        <RunArguments>blazor serve</RunArguments>
        <LangVersion>7.3</LangVersion>
    </PropertyGroup>
    
    <ItemGroup>
        <PackageReference Include="Microsoft.AspNetCore.Blazor.Browser" Version="0.6.0" />
        <PackageReference Include="Microsoft.AspNetCore.Blazor.Build" Version="0.6.0" />
        <DotNetCliToolReference Include="Microsoft.AspNetCore.Blazor.Cli" Version="0.6.0" />
    </ItemGroup>
    
    </Project>
    

That's it! You're now ready to try out the latest Blazor features.

Templated components

Blazor 0.6.0 adds support for templated components. Templated components are components that accept one or more UI templates as parameters, which can then be used as part of the component's rendering logic. Templated components allow you to author higher-level components that are more reusable than what was possible before. For example, a list view component could allow the user to specify a template for rendering items in the list, or a grid component could allow the user to specify templates for the grid header and for each row.

Template parameters

A templated component is defined by specifying one or more component parameters of type RenderFragment or RenderFragment<T>. A render fragment represents a segment of UI that is rendered by the component. A render fragment can optionally take a parameter that is specified when the render fragment is invoked.

TemplatedTable.cshtml

@typeparam TItem

<table>
    <thead>
        <tr>@TableHeader</tr>
    </thead>
    <tbody>
    @foreach (var item in Items)
    {
        @RowTemplate(item)
    }
    </tbody>
    <tfoot>
        <tr>@TableFooter</tr>
    </tfoot>
</table>

@functions {
    [Parameter] RenderFragment TableHeader { get; set; }
    [Parameter] RenderFragment<TItem> RowTemplate { get; set; }
    [Parameter] RenderFragment TableFooter { get; set; }
    [Parameter] IReadOnlyList<TItem> Items { get; set; }
}

When using a templated component, the template parameters can be specified using child elements that match the names of the parameters.

<TemplatedTable Items="@pets">
    <TableHeader>
        <th>ID</th>
        <th>Name</th>
        <th>Species</th>
    </TableHeader>
    <RowTemplate>
        <td>@context.PetId</td>
        <td>@context.Name</td>
        <td>@context.Species</td>
    </RowTemplate>
</TemplatedTable>

Template context parameters

Component arguments of type RenderFragment<T> passed as elements have an implicit parameter named context, but you can change the parameter name using the Context attribute on the child element.

<TemplatedTable Items="@pets">
    <TableHeader>
        <th>ID</th>
        <th>Name</th>
        <th>Species</th>
    </TableHeader>
    <RowTemplate Context="pet">
        <td>@pet.PetId</td>
        <td>@pet.Name</td>
        <td>@pet.Species</td>
    </RowTemplate>
</TemplatedTable>

Alternatively, you can specify the Context attribute on the component element (e.g., <TemplatedTable Context="pet">). The specified Context attribute applies to all specified template parameters. This can be useful when you want to specify the content parameter name for implicit child content (without any wrapping child element).
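
For example, the table above could set the parameter name once at the component level:

<TemplatedTable Items="@pets" Context="pet">
    <TableHeader>
        <th>ID</th>
        <th>Name</th>
        <th>Species</th>
    </TableHeader>
    <RowTemplate>
        <td>@pet.PetId</td>
        <td>@pet.Name</td>
        <td>@pet.Species</td>
    </RowTemplate>
</TemplatedTable>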

Generic-typed components

Templated components are often generically typed. For example, a generic ListView component could be used to render IEnumerable<T> values. To define a generic component, use the new @typeparam directive to specify type parameters.

GenericComponent.cshtml

@typeparam TItem

@foreach (var item in Items)
{
    @ItemTemplate(item)
}

@functions {
    [Parameter] RenderFragment<TItem> ItemTemplate { get; set; }
    [Parameter] IReadOnlyList<TItem> Items { get; set; }
}

When using generic-typed components, the type parameter will be inferred if possible. Otherwise, it must be explicitly specified using an attribute that matches the name of the type parameter:

<GenericComponent Items="@pets" TItem="Pet">
    ...
</GenericComponent>

Razor templates

Render fragments can be defined using Razor template syntax. Razor templates are a way to define a UI snippet. They look like the following:

@<tag>...</tag>

You can now use Razor templates to define RenderFragment and RenderFragment<T> values like this:

@{ 
    RenderFragment template = @<p>The time is @DateTime.Now.</p>;
    RenderFragment<Pet> petTemplate = (pet) => @<p>Your pet's name is @pet.Name.</p>;
}

Render fragments defined using Razor templates can be passed as arguments to templated components or rendered directly. For example, you can render the previous templates directly like this:

@template

@petTemplate(new Pet { Name = "Fido" })

Use server-side Blazor with the Azure SignalR Service

In the previous Blazor release we added support for running Blazor on the server where UI interactions and DOM updates are handled over a SignalR connection. In this release we refactored the server-side Blazor support to enable using server-side Blazor with the Azure SignalR Service. The Azure SignalR Service handles connection scale out for SignalR based apps, scaling up to handle thousands of persistent connections so that you don't have to.

To use the Azure SignalR Service with a server-side Blazor application:

  1. Create a new server-side Blazor app.

    dotnet new blazorserverside -o BlazorServerSideApp1
    
  2. Add the Azure SignalR Service SDK to the server project.

    dotnet add BlazorServerSideApp1/BlazorServerSideApp1.Server package Microsoft.Azure.SignalR
    
  3. Create an Azure SignalR Service resource for your app and copy the primary connection string.

  4. Add a UserSecretsId property to the BlazorServerSideApp1.Server.csproj project file.

    <PropertyGroup>
        <UserSecretsId>BlazorServerSideApp1.Server.UserSecretsId</UserSecretsId>
    </PropertyGroup>
    
  5. Configure the connection string as a user secret for your app.

    dotnet user-secrets -p BlazorServerSideApp1/BlazorServerSideApp1.Server set Azure:SignalR:ConnectionString <Your-Connection-String>
    

    NOTE: When deploying the app you'll need to configure the Azure SignalR Service connection string in the target environment. For example, in Azure App Service configure the connection string using an app setting.

  6. In the Startup class for the server project, replace the call to app.UseServerSideBlazor<App.Startup>() with the following code:

    app.UseAzureSignalR(route => route.MapHub<BlazorHub>(BlazorHub.DefaultPath));
    app.UseBlazor<App.Startup>();
    
  7. Run the app.

    If you look at the network trace for the app in the browser dev tools you see that the SignalR traffic is now being routed through the Azure SignalR Service. Congratulations!

Razor Components to ship with ASP.NET Core in .NET Core 3.0

We announced last month at .NET Conf that we've decided to move forward with shipping the Blazor server-side model as part of ASP.NET Core in .NET Core 3.0. About half of Blazor users have indicated they would use the Blazor server-side model, and shipping it in .NET Core 3.0 will make it available for production use. As part of integrating the Blazor component model into ASP.NET Core, we've decided to give it a new name to differentiate it from the ability to run .NET in the browser: Razor Components. We are now working towards shipping Razor Components and its editing tooling in .NET Core 3.0. This includes integrating Razor Components into ASP.NET Core so that it can be used from MVC. We expect to have a preview of this support early next year, after the ASP.NET Core 2.2 release has wrapped up.

Our primary goal remains to ship support for running Blazor client-side in the browser. Work on running Blazor client-side on WebAssembly will continue in parallel with the Razor Components work, although it will remain experimental for a while longer while we work through the issues of running .NET on WebAssembly. We will however keep the component model the same regardless of whether you are running on the server or the client. You can switch your Blazor app to run on the client or the server by changing a single line of code. See the Blazor .NET Conf talk to see this in action and to learn more about our plans for Razor Components:

Give feedback

We hope you enjoy this latest preview release of Blazor. As with previous releases, your feedback is important to us. If you run into issues or have questions while trying out Blazor, file issues on GitHub. You can also chat with us and the Blazor community on Gitter if you get stuck or to share how Blazor is working for you. After you've tried out Blazor for a while please let us know what you think by taking our in-product survey. Click the survey link shown on the app home page when running one of the Blazor project templates:

Blazor survey

Thanks for trying out Blazor!

AI, Machine Learning and Data Science Announcements from Microsoft Ignite


Microsoft Ignite, Microsoft's annual developer conference, wrapped up last week and many of the big announcements focused on artificial intelligence and machine learning. The keynote presentation from Microsoft's Cloud AI lead Eric Boyd showcases the major developments, or you can check out his accompanying blog post for a written summary. 

In this post, I'll dive down into the details of the specific announcements and provide links to the associated recorded presentations from Ignite and other technical documents.

Tools for AI developers

Azure Machine Learning Service (Azure ML) adds several new capabilities. These include a new Python package that developers can use to prepare data for analysis, automate machine learning and deep learning model training, track model performance under various conditions, and automate the model selection process (see below). It supports a variety of execution environments, from local desktops to GPU clusters, and can deploy trained models with containers or even high-performance FPGA chips. (The desktop "Workbench" client, previously in preview, is no longer required and has been deprecated.) Watch the Ignite presentation AI with Azure Machine Learning services: Simplifying the data science process for an overview and demo.

Automated Machine Learning, a new Azure ML service in preview, automates the selection of machine learning algorithms (and associated parameters and hyperparameters) to optimize predictive performance. This blog post provides the technical details, and another blog post describes some practical applications. A Microsoft Research paper describes the probabilistic matrix factorization technique behind the optimization. This Ignite presentation provides further details and includes a demo.  

Visual Studio Code Tools for AI has been updated to provide a convenient interface to Azure ML for users of the popular open-source editor.

SQL Server 2019 is now in public preview, with many improvements shown in this Ignite keynote. Updates include Spark support, and added Linux support for in-database R and Python (watch this Ignite presentation for details).

Azure Databricks is now supported in more regions, offers GPU support for deep learning, and Databricks Delta is now available (in preview) for transactional data capabilities.

Customized AI

Microsoft Bot Framework SDK v4 is now available. The Ignite presentation "Creating Enterprise-Scale Intelligent Agents and Bots" provides an overview and several examples.

Cortana Skills Kit for Enterprise, a development platform based on Azure Bot Service and Language Understanding, is now in private preview.

Pre-trained AI

Speech Service in Azure Cognitive Services is now generally available, and includes a new neural text-to-speech capability for humanlike synthesized speech. This Ignite presentation provides an overview.

This Ignite presentation provides an overview of Cognitive Services, with demos of the pre-trained AI capabilities for computer vision, speech, search, and more.

Programs

AI for Humanitarian Action announced. With this new AI for Good program, Microsoft will dedicate $40m over five years to deploy AI technologies to assist with disaster recovery; the needs of children, refugees and displaced people; and human rights. A short video summarizes the initiative.

And that's just the highlights — for example, I haven't included the many AI capabilities added to products like Windows and Office. (However I can't resist mentioning the new Insert Data from Picture feature in Excel, which I'm sure I'll be making a lot of use of.) For more from Ignite, take a look at Buck Woody's index of presentations at Ignite, with highlights of videos focused on AI, machine learning, and data science.

New inking and 3D updates bring presentation design to the next level—this and more coming to Office in October


For the last four years, we’ve been on a mission to transform Office and use artificial intelligence (AI) to make everyday tasks easier. In June, we revealed a fresh, new design for our Office apps. And just a few days ago, we announced new AI-powered features in PowerPoint and Excel.

Today, we’re making it even easier to showcase your ideas, stay organized, and create surveys and polls with the following updates:

  • With AI-powered inking and 3D updates in Word and PowerPoint, you can use a touch-enabled device and digital pen to ink your ideas and transform them into perfectly formatted content. You can also choose from 30 new 3D models with built-in animations to bring your content to life.
  • Two new updates in Outlook.com make staying on top of tasks easier and interacting with your favorite brands simpler. New integration with Microsoft To-Do helps you manage your tasks without leaving your inbox. We’re also launching a new experience to help you easily interact with the brands you love in Outlook.com.
  • To help you keep track of tasks using whatever mode is most comfortable for you, we’re adding the ability to update your tasks using ink. With your digital pen and a touch-enabled Windows device, simply add a task to your list using ink, and then strike it out when complete.
  • Since releasing Microsoft Forms for education and commercial organizations, millions of people have used Forms to create surveys and quizzes. Today, the Forms Public Preview is available to our consumer customers as well.

These updates are built to help you save time and stay organized. Check out this post to learn more.

An animated image highlights new inking features in PowerPoint.

The post New inking and 3D updates bring presentation design to the next level—this and more coming to Office in October appeared first on Microsoft 365 Blog.


Azure.Source – Volume 51


A Microsoft Ignite 2018 retrospective

As you probably know by now, many thousands of people descended on Orlando, Florida last week for Microsoft Ignite 2018. In this volume of Azure.Source, I've gathered in one place the links you need to dig through all of the announcements we made. As you will read and see below, a LOT happened last week.

Microsoft Ignite 2018 - Vision keynote - Microsoft CEO, Satya Nadella, kicks off Ignite and Envision 2018 with this keynote that lays out the Microsoft vision that will empower every person and every organization on the planet to achieve more.

IT and developer success with Microsoft Azure - Microsoft Azure gives you the freedom to build, manage, and deploy cloud native and hybrid applications on a massive, global cloud using your favorite tools and frameworks. Join Scott Guthrie, EVP Cloud + AI & Julia White, CVP Azure, as they demonstrate the latest advances.

Microsoft Ignite | The Tour - If you missed out on attending Ignite in Orlando, don't fret - Ignite may be coming to a city near you over the coming months. Join us at the place where developers and tech professionals continue learning alongside experts. Explore the latest developer tools and cloud technologies and learn how to put your skills to work in new areas. Connect with our community to gain practical insights and best practices on the future of cloud development, data, IT, and business intelligence.

Microsoft Ignite 2018 news and highlights - In September, we did some work on the Updates area of the Azure website to make it easier to track changes to Azure services and the roadmap showing what's in development. This section of Updates provides a concise directory of what we announced at Ignite.

Microsoft Learn - A new way to learn Azure by following guided learning paths to build practical job skills you can start using right away: Azure Fundamentals, Administer infrastructure resources in Azure, and more.

Microsoft Learn illustration

Infrastructure

A crazy amount of new Azure infrastructure announcements - Corey Sanders, Corporate Vice President, Azure, covers a plethora of announcements in this blog post, which is a companion post to his general session at Ignite:

Get the details:

Azure Friday | Introducing the Azure Data Box family - Mathew Dickson joins Donovan Brown to discuss the Data Box family of solutions to meet the challenge of moving data to the cloud. Azure Data Box offline devices help you transfer large amounts of data to Azure when the network isn't an option. Data Box online products act as network storage gateways to intelligently manage data between your site and Azure.

Azure Friday | Azure Front Door Service (Preview) - Sharad Agrawal joins Lara Rubbelke to discuss the recently announced Azure Front Door Service, which takes your application availability global and real-time while maximizing performance by connecting you to your users using Microsoft's massive global edge network. In this session, you'll learn how to set up a Front Door for your application and the performance and reliability benefits you can achieve.

Data

Ignite 2018 - Making AI real for your business with Azure Data - Rohan Kumar, Corporate Vice President, Azure Data, shares how the confluence of Cloud, Data, and Artificial Intelligence (AI) is driving unprecedented change and is rapidly becoming foundational to innovation in every industry, which is a companion post to his general session at Ignite:

Get the details:

Azure SQL Data Warehouse

Redefine data analytics with Modern Data Warehouse on Azure - John Macintyre, Partner Group Program Manager, Azure SQL Data Warehouse, demonstrates why thousands of customers are now using Azure SQL Data Warehouse to take advantage of the fast, flexible, and secure analytics platform to gain deeper insights and make better decisions.

See also:

Azure Cosmos DB

Azure Cosmos DB - database for Intelligent Cloud - Intelligent Edge era - Rimma Nehme, Product Manager and Architect, Azure Cosmos DB, announces new capabilities that serve as a crucial step towards enabling anyone to easily build globally distributed apps for the Intelligent-Cloud-Intelligent Edge era.

See also:

Azure HDInsight

Deep dive into Azure HDInsight 4.0 - Ashish Thapliyal, Principal Program Manager, Azure HDInsight, announces HDInsight 4.0, which brings the latest Apache Hadoop 3.0 innovations, representing over 5 years of work from the open source community and our partner Hortonworks, across key Apache frameworks to solve ever-growing big data and advanced analytics challenges.

See also:

Artificial Intelligence

Azure AI – Making AI real for business - Eric Boyd, Corporate Vice President, Azure AI, announces a range of innovations across Machine Learning, Cognitive Services, and knowledge mining to make Azure the best place for AI. This blog post is a companion post to his general session at Ignite:

Get the details:

Management

New full stack monitoring capabilities in Azure Monitor - Shiva Sivakumar, Director of Program Management, Azure Monitoring & Diagnostics, announces that Azure Log Analytics and Azure Application Insights are now available as integrated features within Azure Monitor, without any loss or compromise of any capability, and with the same pricing.

Get the details:

See also:

Internet of Things

Microsoft Azure enables a new wave of edge computing. Here’s how. - Julia White, Corporate Vice President, Microsoft Azure, shares how we’re building Azure cloud/edge capabilities aligned with these enduring principles while uniquely delivering consistency across the cloud and the edge.

Get the details:

Azure Friday | Introducing Azure Data Box Edge - Andrew Mason joins Donovan Brown to introduce Data Box Gateway and Data Box Edge, two new data network transfer solutions from Azure. They both enable easy data transfer to the cloud, with Data Box Edge providing pre-processing for fast, local results or to modify the data before upload.

The IoT Show | Deploying and Managing IoT From an Ops Perspective - Shot live at Microsoft Ignite 2018.

The IoT Show | Routing Messages in Azure IoT Hub based on Device Twin - Check out this great demo of the new way to configure message routing in Azure IoT Hub using Device Twin properties and the updated UI in the Azure portal. Paul Montgomery, Senior Software Engineer in the Azure IoT Team shows us how this all works using an ESP32 based microcontroller, a temperature sensor, a glass of hot water and another glass of iced water.

The IoT Show | First Anniversary for Azure IoT Hub Device Provisioning Service - IoT Hub Device Provisioning Service helps customers provision and register IoT devices at scale in a secure manner. To celebrate the service anniversary, Nicole Berdy, PM in the Azure IoT team shows us the latest additions to the service to make it even easier to provision IoT devices.

The IoT Show | Azure Sphere Overview - What is Azure Sphere? Let's ask Ed Nightingale, Partner Architect for the new Microsoft solution for creating highly-secured, connected MCU powered devices.

The IoT Show | Azure IoT Central is now Generally Available - The Microsoft SaaS offer for IoT, Azure IoT Central, is now generally available. This means you can now take it to production. That also means you get a load of new features enabling IoT scenarios at scale such as the support for automatically provisioning devices, the ability to setup advanced analytics rules, continuously export devices data and meta data to a blob storage, a simple per-device cost, a totally free trial and much more. Marcello Majonchi and Larry Jordan from the IoT Central team visited the IoT Show to tell us all about these new features.

The IoT Show | Dial up your MXChip Azure IoT starter kit - Connect an Azure IoT DevKit to Azure IoT Hub, view sensor readings in a custom web application, and see how Azure IoT Hub is able to process messages in real time from tangible devices. The sample also covers communicating with the device via Direct Methods and setting up a rules engine.

Networking

Azure Networking Fall 2018 update - Yousef Khalidi, CVP, Azure Networking, announces 100 Gbps connectivity (the fastest in the public cloud), the availability of branch connectivity, new cloud-native security capabilities, and application performance services.

Get the details:

The Azure Podcast

The Azure Podcast | Episode 248 - Updates from Ignite 2018 - A whole bunch of Azure updates were announced at Ignite so Cynthia, Cale and Sujit try to cover as much as possible in 30 minutes!

Now in preview

Azure Container Registry: Public preview of Helm Chart Repositories and more - As customers deploy multi-container applications, Helm has evolved into the de facto standard for describing Kubernetes-based applications. With Helm repositories, customers can push their Helm Charts to Azure Container Registry, providing a single source of truth for their images and deployment definitions running in Kubernetes.

Announcing the public preview of Shared Image Gallery - Shared Image Gallery provides an Azure-based solution to make the custom management of virtual machine (VM) images easier in Azure.

Announcing private preview of Azure VM Image Builder - Azure Virtual Machine (VM) Image Builder, now available in private preview, enables you to migrate your image building pipeline to Azure.

Azure Active Directory authentication for Azure Files SMB access now in public preview - Integration with Azure AD enables SMB access to Azure file shares using Azure AD credentials from Azure AD DS domain joined Windows VMs. In addition, Azure Files supports preserving, inheriting, and enforcing Microsoft file system NTFS ACLs on all folders and files in a file share.

Introducing Azure Premium Blob Storage (limited public preview) - Azure Premium Blob Storage complements the existing Hot, Cool, and Archive tiers by storing data on solid-state drives, which are known for lower latency and higher transactional rates compared to traditional hard drives.

Announcing Webapps Undelete (Preview) - Undelete a deleted web app to restore a site deleted in the past 30 days, including its content and configuration; the host name will also be restored, if available.

Microsoft 365 adds modern desktop on Azure - A cloud-based service that delivers a multi-user Windows 10 experience, optimized for Office 365 ProPlus, and includes free Windows 7 Extended Security Updates.

A new era for Azure Files: Bigger, faster, better! - Get larger, low latency file shares with more IOPS, higher throughput, and Azure Active Directory integration. In addition, a new Azure File Sync agent is coming with significant improvements in the sync performance, tiering, and reporting.

Diagram showing Azure Files scale and performance

Premium Files pushes Azure Files limits by 100x! - Azure Premium Files provides fully managed file services, optimized to deliver consistent performance at 100x improvement over the existing Azure Files. It is designed for IO intensive enterprise workloads that require high throughput and a single digit millisecond latency.

Announcing Bring your own Storage to App Service - A new feature to App Service on Linux and Web App for Containers that supports mounting Azure Blobs and Azure Files into your Azure App Service. You can configure up to five Azure storage accounts for a given App Service.

Now generally available

Azure SignalR Service now generally available - Get a fully-managed SignalR service that enables you to focus on building real-time web experiences without worrying about setting up, hosting, scaling, or load balancing your SignalR server.

General availability of Tomcat and Java SE on App Service on Linux - Using App Service on Linux, you can deploy your Java project to either Tomcat or your favorite Java SE platform, such as Spring.

Serial console for Azure VMs now generally available - Serial console for Azure VMs is now generally available in all public regions. New features include magic SysRq keys, non-maskable interrupts, and subscription-wide enable/disable.

Azure Friday | Troubleshoot and diagnose Azure Virtual Machines with Serial Console - Alfred Sin joins Donovan Brown to discuss the Azure Serial Console, which is now generally available in all public clouds. Serial console enables you to use an interactive shell in situations where you may be unable to SSH or RDP into your VM, which make it super-easy for troubleshooting and self-serve diagnosing of your VM.

Introducing Azure Functions 2.0 - In addition to UX updates, now you can use your cross-platform .NET Core assets within your Functions apps. At the core of the Azure Functions runtime is the function host, re-written to run in .NET Core 2.1. This not only brings significant performance improvements, but also allows Functions to be written and hosted locally on all major platforms—Windows, Mac, and Linux.
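
As a taste of the 2.0 programming model, here is a minimal HTTP-triggered function (a sketch; the function and parameter names are placeholders) running on the .NET Core based host:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class HelloFunction
{
    [FunctionName("Hello")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req)
    {
        // Read an optional query-string parameter and echo it back.
        string name = req.Query["name"];
        return new OkObjectResult($"Hello, {name ?? "world"}!");
    }
}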

News and updates

Announcing the New App Service Diagnostics Experience - App Service Diagnostics is our intelligent and interactive experience to help you diagnose and troubleshoot issues with your app. You can use Genie to guide you through the different ways to troubleshoot a variety of potential issues, since sometimes bad things happen to good apps.

Data-driven styling and more in the latest Azure Maps Web SDK update - The latest update to the Azure Maps Web SDK is now available, which includes developer-focused API improvements, a spatial math library, and more.

SAP at Microsoft Ignite: Announcing SAP Data Custodian for Azure - SAP Data Custodian, a SaaS offering from SAP, will soon become available on Azure. SAP Data Custodian combines Azure’s built-in compliance controls and the deep expertise of SAP to provide customers end-to-end visibility of their SAP data on Azure and an easy to use set of data governance controls. This will help customers to fulfill their responsibility for data governance, segregation of oversight duties, and achieve independent verification.

Building with blockchain on Azure - Provides an update on companies building with blockchain on Azure and the Azure Blockchain Workbench, which simplifies development and eases experimentation with prebuilt networks and infrastructure.

Azure Pipelines is the CI/CD solution for any language, any platform, any cloud - Azure Pipelines is a CI/CD service that enables you to continuously build, test, and deploy to any platform or cloud. It has cloud-hosted agents for Linux, macOS, and Windows, powerful workflows with native container support, and flexible deployments to Kubernetes, VMs, and serverless environments. Azure Pipelines provides a great offer for open source projects with unlimited build minutes and 10 free parallel jobs, and integrates well with GitHub repos.

Go Cloud-native with Spring Cloud on Azure - To streamline your efforts to run Spring apps in the cloud, we announced the Spring Cloud for Azure project. In addition, with the new Spring Boot Extension Pack in Visual Studio Code, you get a lightweight development experience for your Spring Boot applications.

Manage Azure Monitor using Java - We released version 1.16 of the Azure Management Libraries for Java. Now you can programmatically manage Azure Monitor using Azure Management Libraries for Java.

Azure Quickstart Center update - Azure Quickstart Center helps you start your projects in Azure quickly with step-by-step guidance. New enhancements to the Azure Quickstart Center experience help you set up your Azure environment with best practices and guidance from Azure.

Launching Quickstart Center from the global search box

New features and enhancements released in Bing Custom Search V3 - We released additions and enhancements to the Bing Custom Search product to help developers and non-developers to harness the power of the web by enabling them to build an even more powerful and customizable search experience.

Red Hat OpenShift and Microsoft Azure Stack together for hybrid enterprise solutions - OpenShift and Azure Stack enable a consistent application experience across Azure, Azure Stack, bare-metal, Windows and RHEL bringing together Microsoft’s and Red Hat’s developer frameworks and partner ecosystems.

Move Managed Disks and VMs now available - Move Managed Disks and VMs across resource groups and subscriptions with a single click. This also enables you to move Managed Images and Snapshots.

Azure Service Fabric updates at Ignite 2018 - Service Fabric is a microservices platform to build, deploy, discover, and scale services with message routing, low-latency storage, and health monitoring. Announces an update of Service Fabric Mesh preview and the pending release of Service Fabric runtime version 6.4 with corresponding SDK and tooling updates.

Strengthen security with key Azure innovations - We announced new capabilities and services at Ignite that enable you to manage your identity without a password, added new layers to network protection, protecting data wherever it is to enable confidential computing, and more.

Microsoft Azure: The only consistent, comprehensive hybrid cloud - More than 95 percent of Fortune 500 companies trust their business on Azure today, and many of them take advantage of Azure hybrid capabilities to fuel innovation and deliver great business outcomes. Learn how customers are taking advantage of these capabilities and how we're supporting your hybrid needs with the ultimate consistent hybrid cloud.

The Azure DevOps Podcast

The Azure DevOps Podcast | Sam Guckenheimer on Testing, Data Collection, and the State of DevOps Report - Episode 003 - Sam Guckenheimer explains the exciting new offer around Azure Pipelines for open source teams, changes he has seen in the industry from his many years of working at Microsoft, and some of the biggest changes in how users work with Azure DevOps.

Technical content and training

The OWASP DevSlop Show - Learn about DevSecOps, which weaves security through DevOps on the OWASP DevSlop Show, which streams live every Sunday at 1:00 PM ET, on Mixer, Twitch, and YouTube, for approximately an hour. Past episodes are available on YouTube.

Azure Functions and Azure Storage: secure authentication with Managed Identities and without managing keys! - Learn how to use Managed Identities as a secure and convenient approach to authenticating to Azure services from serverless Functions, without managing or storing any access keys or credentials.

How to securely access cloud resources from Linux VMs on Azure - Learn how to significantly improve the authentication process between cloud resources and reduce the operational overhead and vulnerability of managing and storing authentication credentials.

Broadcast Real-time Updates from Cosmos DB with SignalR Service and Azure Functions - Learn how you can use Azure Functions and SignalR Service to broadcast real-time document changes in Cosmos DB to clients over WebSockets.

Azure tips & tricks


How to test web applications in production - Learn how to test web applications in production using Azure App Service. This testing-in-production feature makes it easier for you to carry out A/B testing with different versions of your application.


How to use the Azure Resource Explorer - Learn how to use the Azure Resource Explorer to quickly explore REST APIs. The Azure Resource Explorer is not only an interesting way to explore your resources; it also shows you the commands you can invoke via PowerShell or the Azure CLI.

Events

What to expect in Spark + AI summit Europe - The Spark + AI Summit Europe is in London this week, where Microsoft and many of its customers using Azure Databricks are present.

Customers and partners

Microsoft and Azul Systems bring free Java LTS support to Azure - Java developers on Azure and Azure Stack can build and run production Java applications using Azul Systems Zulu Enterprise builds of OpenJDK without incurring additional support costs. If you’re currently running Java apps on-premises or with other JDKs, consider moving them to Zulu on Azure for free support and maintenance.

Healthcare Cloud Security Stack now available on Azure Marketplace - Healthcare Cloud Security Stack, which is now available on Azure Marketplace, uses Qualys Vulnerability Management and Cloud Agents, Trend Micro Deep Security, and XentIT Executive Dashboard as a unified cloud threat management solution. Qualys cloud agents continuously collect vulnerability information, which is mapped to Trend Micro Deep Security IPS.

Azure ISVs expand the possibilities with Azure Stack - This post highlights some of our ISV partners that address common customer requirements for Azure Stack.

A Cloud Guru's Azure This Week


Azure This Week | Ignite Special - 25 September 2018 - Lars Klint discusses the general availability of Azure Functions 2.0, enhancements to Azure Logic Apps and the public preview of Azure Blueprints.

Azure This Week | Ignite Special - 26 September 2018 - Lars Klint discusses Azure Digital Twins, Azure Sphere and enhancements to Azure IoT platform. He also speaks with Burke Holland and Brian Clark from Microsoft.


Azure This Week | Ignite Special - 27 September 2018 - Lars Klint talks about AI for Humanitarian Action, new Machine Learning capabilities, and the new Speech Service, which is now generally available. He also grabs interviews with Magnus Mårtensson, CEO of Loftysoft, and Scott Duffy, Chief Architect and Trainer.

Start developing on Windows 10 October 2018 Update today


We’re excited to announce that the Windows 10 October 2018 Update (build 17763) and SDK is now available. You may also know this as Windows 10, version 1809.

New APIs and Features for developers

Every update of Windows 10 is loaded with new APIs but don’t worry, Windows Dev Center has a full list of what is new for developers. Here are a few of my favorites.

  1. WinUI Library: This is the way to get new controls! We are shipping new controls out-of-band such as Menu Bar and Navigation View. Head to https://www.nuget.org/packages/Microsoft.UI.Xaml for the latest package.
  2. Density UI improvements: We’re making the default spacing and sizing about 20% tighter compared to the old standard.
  3. MSIX packaging tool: The new packaging tool makes upgrading existing applications to the new packaging format a breeze. It works both through an interactive UI and from the command line! If you want to learn more about MSIX, head over to Docs.

Update your dev environment in two simple steps

  1. Update your system to Windows 10 October 2018 Update.
  2. Get Visual Studio 2017 with the updated tooling and Windows 10 SDK.

How to update your device to Windows 10 October 2018 Update

There are multiple ways you can update, but the easiest way is to go to Windows Update in your settings and click “check for updates.” It’s that simple!

Acquire the Windows 10 SDK and Visual Studio 2017

Now that your system is on Windows 10 October 2018 Update, install Visual Studio and the new SDK.

  • Currently with Visual Studio 2017, version 15.8:
  1. Run the installer or go to https://www.visualstudio.com/downloads/ and download it.
  2. Go to “Individual Components”
  3. Go to “SDKs, libraries, and frameworks” section
  4. Check “Windows 10 SDK (10.0.17763)”
  5. Click “Install”
  • In the near future with Visual Studio 2017, version 15.9:
  1. Run the installer or go to https://www.visualstudio.com/downloads/ and download it.
  2. Select “Universal Windows Platform development” under Workloads, Windows 10 SDK (10.0.17763) will be included by default
  3. Click “Install.”

More useful tips

Do you want tools for C++ desktop or game development for UWP? Be sure one of these two is selected:

  • For UWP development: the C++ Universal Windows Platform tools option in the Universal Windows Platform development workload.
  • For desktop development: the Desktop development with C++ workload together with the Windows 10 SDK (10.0.17763).

Once your system is updated and your app is recompiled and tested, submit it to Dev Center.

Your take on the Update

Tell us what crazy things you’ve been working on with the new Update by tweeting @WindowsDev or @ClintRutkas.

The post Start developing on Windows 10 October 2018 Update today appeared first on Windows Developer Blog.

Spark + AI Summit Europe – Developing from cloud to the edge


Organizations around the world are gearing up for a future powered by data, cloud, and Artificial Intelligence (AI). This week at Spark + AI Summit Europe, I talked about how Microsoft is committed to delivering cutting-edge innovations that help our customers navigate these technological and business shifts.

The driving force behind powerful AI applications is data – and getting the most out of AI requires a modern data estate. Organizations are using their data to extract important insights to drive their businesses forward and engage their customers in ways that were simply not possible before. One such example is the Real Madrid Football Club, one of the world’s top sports franchises with 500 million fans worldwide. Real Madrid built a global digital sports platform to engage one-on-one with fans, implement personalized promotional campaigns, and use data to track and analyze fan behaviors, among many other capabilities. This data-driven strategy has led to a 400 percent increase in installed fan base, and a 30 percent increase in digital revenue growth for the club.

“We used to pull data from just five sources before, but now we pull from more than 70 sources using our Microsoft Azure platform. This has enabled us to grow our number of fan profiles by about 400 percent in the past two years—it’s now in the millions.”

- Begoña Sanz, Commercial General Manager, Real Madrid

The influx of data-powered AI opportunities is not limited to the intelligent cloud; we are also seeing a foundational shift in the computing paradigm to the intelligent edge. The intelligent edge enables you to package your applications into containers and deploy them onto the devices themselves, delivering the intelligence closer to the source of data. As we announced last week, we are continuing to invest in the intelligent edge with tools and services like Azure Sphere, IoT Edge, and Azure Stack, because we fundamentally believe that the impact of data and AI is made even more profound with the power of the intelligent cloud and the intelligent edge. Customers like BMW are revolutionizing their customer experience with connected cars. DunavNET has brought the power of digital transformation to farmers to help them generate better quality food, higher yields, and increased profits. Rolls-Royce, a leading manufacturer of aircraft engines, is using the intelligent edge to analyze data remotely and deliver real-time actionable insights to airlines about engine performance for predictive maintenance. These are just a few examples of customers leveraging their data, wherever it exists, to turn it into breakthrough insights.

To enable developers and data scientists to build data and AI solutions, in March this year we announced the general availability of Azure Databricks, a fast, easy, and collaborative Apache® Spark™-based analytics service on Azure. Since then we have seen tremendous momentum as customers continue to leverage the power of Spark to build machine learning and deep learning models. To build on that momentum, last week at the Microsoft Ignite conference we announced the preview of Delta, service availability in 9 additional regions, and other exciting features that you can learn more about in the blog.

Azure Databricks, along with the intelligent edge, is enabling customers like Shell to build machine vision and AI solutions at their gas station retail locations. This enables them to detect unsafe actions in near real time and alert staff so that they can intervene.

“We’re taking advantage of the Azure Databricks shared environment—we’ve made it our preferred collaboration platform, and it’s helping our data scientists and engineers share more and get to the next level of AI sophistication. Because Azure Databricks is highly elastic, we get really powerful spin up/spin down capabilities, and our developers love its neat, elegant user interface.”

- Daniel Jeavons, General Manager for Data Science, Shell

Azure Databricks has also enabled customers like LINX Cargo Care, a national supply chain and logistics provider, to transform their businesses. LINX was able to analyze customer payment behavior to better understand what was paid, what was owed, the gap between them, and payment history, revealing patterns and trends that can help LINX improve customer service and close the payment gaps.

“By using Azure Databricks analytics to understand payment patterns, our customer service reps will be able to make targeted, proactive collection phone calls... We expect to recover 90 percent of outstanding funds by their due dates.”

- Thomas Gianniodis, General Manager IT, LINX CC Group

Regardless of where you want to develop your big data solutions, Microsoft offers the most comprehensive data platform to make AI real for your business. Whether you want to use Spark on SQL Server 2019, build deep learning and AI models in the cloud with Azure Databricks, or deploy them at the edge with Azure Machine Learning service, we have you covered.

At Microsoft, we are committed to delivering AI into the hands of every developer and data scientist so they can unleash the power of data and reimagine possibilities that will improve our world.

Get started today!

We are excited for you to try Azure Databricks! Get started today and let us know your feedback.

Headless CMS and Decoupled CMS in .NET Core


I'm sure I'll miss some, so if I do, please sound off in the comments and I'll update this post over the next week or so!

Lately I've been noticing a lot of "Headless" CMSs (Content Management System). A ton, in fact. I wanted to explore this concept and see if it's a fad or if it's really something useful.

With the rise of clean RESTful APIs has come the rise of headless CMS systems. We've all evaluated CMS systems (ones that included both front- and back-ends) and found the front-end wanting. Perhaps it lacks flexibility, or it's way too flexible and overwhelming. In fact, when I wrote my podcast website I considered a CMS but decided it felt too heavy for just a small site.

A Headless CMS is a back-end only content management system (CMS) built from the ground up as a content repository that makes content accessible via a RESTful API for display on any device.

I could start with a database but what if I started with a CMS that was just a backend - a headless CMS. I'll handle the front end, and it'll handle the persistence.

Here's what I found when exploring .NET Core-based headless CMSs. One thing worth noting is that, given Docker containers and the ease with which we can deploy hybrid systems, some of these solutions have .NET Core front-ends and "who cares, it returns JSON" back-ends!
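
Whichever backend you pick, consuming a headless CMS from ASP.NET Core tends to look the same: fetch JSON over HTTP and bind it to a strongly typed model. Here's a minimal, CMS-agnostic sketch; the endpoint URL and field names are hypothetical:

using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class Article
{
    public string Title { get; set; }
    public string Body { get; set; }
}

public class CmsClient
{
    private static readonly HttpClient Http = new HttpClient();

    // Hypothetical content API endpoint; substitute your CMS's URL and auth scheme.
    public async Task<Article> GetArticleAsync(string slug)
    {
        var json = await Http.GetStringAsync($"https://cms.example.com/api/articles/{slug}");
        return JsonConvert.DeserializeObject<Article>(json);
    }
}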

Lynicon

Lynicon is literally implemented as a NuGet library! It stores its data as structured JSON. It's built on top of ASP.NET Core and uses MVC concepts and architecture.

It does include a front-end for administration but it's not required. It will return HTML or JSON depending on what HTTP headers are sent in. This means you can easily use it as the back-end for your Angular or existing SPA apps.

Lynicon is largely open source at https://github.com/jamesej/lyniconanc. If you want to take it to the next level, there's a small fee that gets you updated searching, publishing, and caching modules.

ButterCMS

ButterCMS is an API-based CMS that seamlessly integrates with ASP.NET applications. It has an SDK that drops into ASP.NET Core and also returns data as JSON. Pulling the data out and showing it in a view is easy.

using System.Threading.Tasks;
using ButterCMS;
using Microsoft.AspNetCore.Mvc;
using Newtonsoft.Json;

public class CaseStudyController : Controller
{
    private ButterCMSClient Client;
    private static string _apiToken = "";

    public CaseStudyController()
    {
        Client = new ButterCMSClient(_apiToken);
    }

    [Route("customers/{slug}")]
    public async Task<ActionResult> ShowCaseStudy(string slug)
    {
        // Ask Butter for the page JSON, then pull the fields out dynamically
        var json = await Client.ListPageAsync("customer_case_study", slug);
        dynamic page = ((dynamic)JsonConvert.DeserializeObject(json)).data.fields;
        ViewBag.SeoTitle = page.seo_title;
        ViewBag.FacebookTitle = page.facebook_open_graph_title;
        ViewBag.Headline = page.headline;
        ViewBag.CustomerLogo = page.customer_logo;
        ViewBag.Testimonial = page.testimonial;
        return View("Location");
    }
}

Then, of course, outputting it in Razor (or putting all of this into a Razor Page) is simple:

<html>
  <head>
    <title>@ViewBag.SeoTitle</title>
    <meta property="og:title" content="@ViewBag.FacebookTitle" /> 
  </head>
  <body>
    <h1>@ViewBag.Headline</h1>
    <img width="100%" src="@ViewBag.CustomerLogo">
    <p>@ViewBag.Testimonial</p>
  </body>
</html>

Butter is a little different (and somewhat unusual) in that their backend API is a SaaS (Software as a Service) and they host it. They then have SDKs for lots of platforms, including .NET Core. The backend is not open source, while the front-end is: https://github.com/ButterCMS/buttercms-csharp.

Piranha CMS

Piranha CMS is built on ASP.NET Core and is open source on GitHub. It's also totally package-based using NuGet and can be easily started up with a dotnet new template like this:

dotnet new -i Piranha.BasicWeb.CSharp
dotnet new piranha
dotnet restore
dotnet run

It even includes a new Blog template that ships with Bootstrap 4.0 and is all set for customization. It does include an optional lightweight front-end, but you can use it as a guideline to create your own client code. One nice touch is that Piranha also handles image resizing and cropping.

Umbraco Headless

The main ASP.NET website currently uses Umbraco as its CMS. Umbraco is a well-known open source CMS that will soon include a Headless option for more flexibility. The open source code for Umbraco is up here https://github.com/umbraco.

Orchard Core

Orchard is a CMS with a very strong community and fantastic documentation. Orchard Core is a redevelopment of Orchard using open source ASP.NET Core. While it's not "headless," it uses a Decoupled Architecture - nothing would prevent you from removing the UI and presenting the content with your own front-end. It's also cross-platform and container friendly.
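
Here's a hedged sketch of what decoupled can look like - it assumes the OrchardCore.ContentManagement package and resolves a content item by its id; the route and the Razor view are entirely yours:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using OrchardCore.ContentManagement;

public class BlogPostController : Controller
{
    private readonly IContentManager _contentManager;

    public BlogPostController(IContentManager contentManager)
    {
        _contentManager = contentManager;
    }

    [Route("posts/{contentItemId}")]
    public async Task<IActionResult> Show(string contentItemId)
    {
        // Pull the content item straight from Orchard's content store...
        var post = await _contentManager.GetAsync(contentItemId);
        if (post == null)
        {
            return NotFound();
        }
        // ...and render it with your own view - no Orchard theme involved
        return View(post);
    }
}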

Squidex

"Squidex is an open source headless CMS and content management hub. In contrast to a traditional CMS Squidex provides a rich API with OData filter and Swagger definitions." Squidex is build with ASP.NET Core and the CQRS pattern and works with both Windows and Linux on today's browsers.

Squidex is open source with excellent docs at https://docs.squidex.io. They are also working on a hosted version you can play with at https://cloud.squidex.io. Samples on how to consume it are at https://github.com/Squidex/squidex-samples.

The consumption is super clean:

[Route("/{slug},{id}/")]
public async Task<IActionResult> Post(string slug, string id)
{
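    // Fetch the post from Squidex by id and wrap it for the view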
    var post = await apiClient.GetBlogPostAsync(id);
    var vm = new PostVM
    {
        Post = post
    };
    return View(vm);
}

And then the View:

@model PostVM
@{
    ViewData["Title"] = Model.Post.Data.Title;
}
<div>
    <h2>@Model.Post.Data.Title</h2>
    @Html.Raw(Model.Post.Data.Text)
</div>
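
The OData querying from Squidex's description above is just query-string options on the content endpoints. Here's a hedged sketch - the app name, schema, and field path are made-up, and auth headers are omitted:

using System.Net.Http;
using System.Threading.Tasks;

public static class SquidexQueryDemo
{
    public static async Task<string> LatestPostsAsync(HttpClient http)
    {
        // OData query options ($filter, $orderby, $top) on a content endpoint;
        // "my-blog", "posts", and the field path are hypothetical examples
        var url = "https://cloud.squidex.io/api/content/my-blog/posts" +
                  "?$filter=data/published/iv eq true" +
                  "&$orderby=created desc&$top=10";
        return await http.GetStringAsync(url); // returns JSON
    }
}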

What .NET Core Headless CMSs did I miss? Let me know.

*Photo "headless" by Wendy used under CC https://flic.kr/p/HkESxW


Sponsor: Telerik DevCraft is the comprehensive suite of .NET and JavaScript components and productivity tools developers use to build high-performance, modern web, mobile, and desktop apps and chatbots. Try it!



© 2018 Scott Hanselman. All rights reserved.

Fuel My Awesome: Celebrating developers and what makes them awesome


#FuelMyAwesome is back to celebrate developers like you, and we want to hear about all the things that make you tick and keep you inspired. Whether that’s a lucky beanie or a cold brew, a delightful desk toy or a fun way to get fit, we want you to share it with us for a chance to win cool swag.
 
Starting October 8, 2018, at 9:00 AM Pacific Time, @msdev will tweet weekly questions to uncover how you get in the zone, celebrate wins, recharge your batteries, and more. Reply directly to the weekly Monday tweets – including the hashtags #FuelMyAwesome and #sweepstakes – by Thursday at 11:59 PM Pacific Time to be entered for a chance to win*.

And remember to check back every Monday for a brand-new question and another chance to win.

Ready to start celebrating you? We are – because you are AWESOME.

FAQs

How do I enter?

  1. Visit twitter.com/msdev.
  2. Find the current week’s #FuelMyAwesome tweet that is pinned to the top of the timeline during each entry period.
  3. Reply using the speech bubble/comment icon underneath the @msdev tweet with the following required elements:
    • Answer the prompted question. No graphic or GIF is required.
    • Include the two campaign hashtags by typing #Sweepstakes #FuelMyAwesome in the body of your reply.
  4. Hit send!

Am I eligible to win?

You need to be:

  • A legal resident of the 50 United States (including the District of Columbia)
  • 18 years of age or older
  • If you’re under 18, you need to have the consent of a parent or legal guardian

You cannot be:

  • An employee of Microsoft Corporation and its subsidiaries, affiliates, advertising agencies, and Sweepstakes Parties or one of their family members.

What can I win?

Each prize pack includes a box with some sweet goodies, including an LED screen, a hoodie, a giant enter button, and more!

How many winners are there each week?

We’ll randomly select 10 lucky winners each week.

How do I find out if I won?

Check your mentions on Fridays! We’ll reply to the week’s winners every Friday. Be sure to DM us within 72 hours so we can confirm that you’re eligible and get your address to send you your prize!

Can I enter every week to win?

Yeah, go for it!

If I already won once, can I win again?

Sorry, only one prize pack per person.

Can I enter to win more than once per week?

You can only enter to win once per week. You can reply as many times as you want, but multiple replies within the weekly entry period will not increase your chances of winning.

*No Purchase Necessary. Open only to legal residents of the 50 U.S. + DC 18+. Game ends 12/6/18. For details, see Official Rules.
