
Share your Azure DevOps feature suggestions with us

Customer feedback is critical to helping us improve Azure DevOps. Over the years we’ve addressed thousands of issues and suggestions originating from our users through many different channels. In order for us to collaborate with you more effectively, we’ve been improving our feedback channels along the way so that they provide us more real time... Read More

Because it’s Friday: Parable of the Polygons


What if we lived in a society where everyone was really happy about living in a diverse neighborhood? What if people only wanted to move when the disparity was really extreme: say, when fewer than 33% of people nearby looked like them? Well, we'd end up with a society like this:

Polygons

Still pretty segregated, as it turns out. At Parable of The Polygons (via Miles McBain), an interesting interactive website, you can experiment with the levels of individual bias that lead the eponymous Polygons to switch locations, and see the impact on the overall Polygon society. It's quite enlightening.

That's all from us at the blog for this week. Have a great weekend, and we'll be back next week.

 

 

Top Stories from the Microsoft DevOps Community – 2018.10.26

Without a doubt, the biggest news this week is that Microsoft has completed its acquisition of GitHub. The future is bright, and I can’t wait to see how people will continue to use GitHub and Azure DevOps together. This partnership opens a new door of possibilities for developers, and I’m so excited to see what... Read More

Azure.Source – Volume 55


Now in preview

Public preview: Named Entity Recognition in the Cognitive Services Text Analytics API

The Text Analytics API is a cloud-based service that provides advanced natural language processing over raw text, and includes four main functions: sentiment analysis, key phrase extraction, language detection, and entity linking. Named Entity Recognition (NER) is the ability to take free-form text and identify the occurrences of entities such as people, locations, organizations, and more. With just a simple API call, NER in Text Analytics uses robust machine learning models to find and categorize more than twenty types of named entities in any text document. This adds to the suite of powerful and easy-to-use natural language processing solutions that make it easy to tackle many problems.
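As a rough illustration, the sketch below calls the entities endpoint with Python's requests library. The region, API path, and response field names shown are assumptions about the preview API, so check the Text Analytics documentation for the exact contract.

# Minimal sketch of calling the Text Analytics entities endpoint. The endpoint
# region, path, API version, and response shape are assumptions based on the
# preview described above -- verify against the official documentation.
import requests

endpoint = "https://westus.api.cognitive.microsoft.com"   # hypothetical region/resource
subscription_key = "<your-text-analytics-key>"

documents = {
    "documents": [
        {"id": "1", "language": "en",
         "text": "Satya Nadella spoke about Azure at Microsoft Ignite in Orlando."}
    ]
}

response = requests.post(
    f"{endpoint}/text/analytics/v2.1-preview/entities",
    headers={"Ocp-Apim-Subscription-Key": subscription_key},
    json=documents,
)
response.raise_for_status()

# Print each recognized entity and its (assumed) type field.
for doc in response.json().get("documents", []):
    for entity in doc.get("entities", []):
        print(entity.get("name"), "-", entity.get("type"))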

Also in preview

Now generally available

What's new in PowerShell in Azure Cloud Shell

PowerShell in Azure Cloud Shell became generally available at Microsoft Ignite 2018. Azure Cloud Shell provides an interactive, browser-accessible, authenticated shell for managing Azure resources from virtually anywhere. With multiple access points, including the Azure portal, the stand-alone experience, Azure documentation, the Azure mobile app, and the Azure Account Extension for Visual Studio Code, you can easily gain access to PowerShell in Cloud Shell to manage and deploy Azure resources.

PowerShell in Azure Cloud Shell GA | Azure Friday

Danny Maertens and Scott Hanselman discuss updates made to PowerShell in Azure Cloud Shell. Learn about PowerShell Core in Linux, new Azure VM remoting cmdlets, and integration with Exchange.

Also generally available

News and updates

Azure Update Management: A year of great updates

You can use the Update Management solution in Azure Automation to manage operating system updates for your Windows and Linux computers that are deployed in Azure, in on-premises environments, or in other cloud providers. You can quickly assess the status of available updates on all agent computers and manage the process of installing required updates for servers. You can enable Update Management for virtual machines directly from your Azure Automation account. You can also enable Update Management for a single virtual machine from the virtual machine pane in the Azure portal for Linux and Windows virtual machines. Read this post to learn about the new features we added over the past year in response to your feedback, including groups with dynamic membership, onboarding new machines into Update Management, pre/post scripts to invoke Azure automation runbooks, update inclusion, and reboot control.

Control and improve your security posture with Azure Secure score

Announced last month at Ignite, Secure score is a security analytics tool that provides visibility of your organization’s security posture and helps to answer the most important question, “How secure is my workload?” Secure score takes into consideration the severity and the impact of the recommendation. Based on that information, it assigns a numerical value to show how fixing this recommendation can improve your security posture. The overall Secure score is an accumulation of all your recommendations. You can view your overall Secure score across your subscriptions or management groups, depending on the scope you select. The score will vary based on subscription selected and the active recommendations on these subscriptions.

Screenshot of Azure Secure score in the Azure portal

Azure Availability Zones expand with new services and to new regions

Availability Zones is a high-availability offering that protects your applications and data from datacenter failures. Availability Zones is now available in the North Europe and West US 2 regions. Zone-redundant services replicate your applications and data across Availability Zones to protect against single points of failure. An expanded list of zone-redundant services, including Azure SQL Database, Service Bus, Event Hubs, Application Gateway, VPN Gateway, and ExpressRoute, is now available. With Availability Zones, Azure offers an industry-best 99.99% VM uptime SLA.

Additional news and updates

Azure shows

Episode 252 - Azure DevOps in the Enterprise | The Azure Podcast

Microsoft AppDev Consultant Stewart Viera talks to us about the power of Azure DevOps and some of the things to watch out for when using it in an Enterprise setting.

Fine-grained security with Apache Ranger on HDInsight Kafka | Azure Friday

Dhruv Goel and Scott Hanselman discuss how to integrate Kafka with Azure Active Directory for authentication and set up fine-grained access control with Apache Ranger to let multiple users access Kafka easily and securely. With Azure HDInsight, you get the best of open source on a managed platform.

Bring your own keys on Apache Kafka with Azure HDInsight | Azure Friday

Dhruv Goel and Scott Hanselman discuss how to get even more control over the security of your data at rest with Bring Your Own Key encryption for Kafka. With Azure HDInsight, you get the best of open source and the security and reliability of a managed platform.

Update Mongoose OS with Automatic Device Management in Azure IoT Hub | IoT Show

The Azure IoT Hub Automatic Device Management feature is now generally available and you can see how it is used in this new episode of the IoT Show to update the firmware of devices running Mongoose OS over the air.

ThreadX integration with Azure IoT Hub | IoT Show

Express Logic's Industrial Grade X-WARE IoT Platform now includes the latest version of the Azure IoT SDK, providing Azure connectivity over MQTT and Automatic Device Management support 'out of the box' for users of the real time operating system ThreadX. Watch to find out what it means to be industrial grade in this conversation between Olivier and John Carbone from Express Logic.

Train Machine Learning Models with Azure ML in VS Code | AI Show

In this episode, we will provide step by step guidance on how to train machine learning models using the Visual Studio Code Tools for AI extension and Azure Machine Learning service.

Technical content

Seven best practices for Continuous Monitoring with Azure Monitor

Azure Monitor helps you understand how your applications are performing and proactively identifies issues affecting them and the resources they depend on. Continuous Monitoring (CM) is a new follow-up concept to DevOps for incorporating monitoring across each phase of your DevOps and IT Ops cycles. This helps ensure the health, performance, and reliability of your apps and infrastructure continuously as they flow from development to production and on to customers. Check out this post for seven best practices for Continuous Monitoring.

Diagram showing Azure Monitor working with Azure DevOps

How developers can get started with building AI applications

Artificial intelligence (AI) is accelerating the digital transformation for every industry, with examples spanning manufacturing, retail, finance, healthcare and many others. At this rate, every industry will be able to use AI to amplify human ingenuity. In this free O'Reilly e-book, A Developer’s Guide to Building AI Applications, Anand Raman and Wee Hyong Tok from Microsoft provide a comprehensive roadmap for developers to build their first AI-infused application.

Cover image of A Developer's Guide to Building AI Applications

IoT device authentication options

This blog post covers the pros and cons of some of the most widely used authentication types supported by the Azure IoT Hub Device Provisioning Service and Azure IoT Hub. In addition, this post includes a wealth of links to Azure IoT security resources, such as white papers, Azure IoT partners, and related blog posts.

Azure Cosmos DB partitioning design patterns – Part 1

In this first part of a multi-part series of posts, see how you can use partition keys in Azure Cosmos DB to efficiently distribute data, improve application performance, and enable faster look-up. This post demonstrates using an Azure Function to consume the change feed to update lookup collections. Lookup collections can make your application more efficient in how it uses request units by using partition keys to avoid costly fan-out queries.

Solution diagram showing Azure Functions working with Azure Cosmos DB Change Feed to update lookup collections

Query Azure Storage analytics logs in Azure Log Analytics

Log Analytics is a service that collects telemetry and other data from a variety of sources and provides a query language for advanced analytics. Learn how to convert Azure Storage analytics logs and post them to an Azure Log Analytics workspace, after which you can use the analysis features in Log Analytics for Azure Storage (Blob, Table, and Queue). A sample PowerShell script shows how to convert Storage Analytics log data to JSON format and post the JSON data to a Log Analytics workspace.
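The post's sample is PowerShell; purely as an illustration of the same flow, here is a hedged Python sketch that converts a simplified Storage Analytics log line to JSON and posts it through the Log Analytics HTTP Data Collector API. The parsed field layout is abbreviated, and the workspace ID, key, and custom log type are placeholders.

# Hedged sketch of the same idea as the PowerShell sample mentioned above:
# turn a Storage Analytics log line into JSON and post it to a Log Analytics
# workspace via the HTTP Data Collector API. The semicolon-delimited field
# layout parsed below is abbreviated/illustrative.
import base64, datetime, hashlib, hmac, json, requests

WORKSPACE_ID = "<workspace-id>"
SHARED_KEY = "<workspace-primary-key>"
LOG_TYPE = "StorageAnalyticsLogs"          # custom log type name (arbitrary)

def build_signature(date, content_length):
    string_to_hash = (f"POST\n{content_length}\napplication/json\n"
                      f"x-ms-date:{date}\n/api/logs")
    decoded_key = base64.b64decode(SHARED_KEY)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"),
                 digestmod=hashlib.sha256).digest()).decode()
    return f"SharedKey {WORKSPACE_ID}:{encoded_hash}"

def post_to_log_analytics(records):
    body = json.dumps(records)
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123_date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    }
    uri = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    requests.post(uri, data=body, headers=headers).raise_for_status()

def parse_log_line(line):
    # Only a few leading fields are mapped here for illustration.
    fields = line.split(";")
    return {"VersionNumber": fields[0], "RequestStartTime": fields[1],
            "OperationType": fields[2], "RequestStatus": fields[3]}

# post_to_log_analytics([parse_log_line(l) for l in open("storage.log")])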

Additional technical content

Azure Tips & Tricks

How to prevent changes to resources in Azure App Services | Azure Tips & Tricks

Thumbnail from How to prevent changes to resources in Azure App Services on YouTube

Learn how to prevent changes to resources in Azure App Service. In Azure App Service, you can easily create locks to prevent other users in your organization from accidentally deleting or modifying your resources.
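As a rough sketch of the same idea outside the portal, the snippet below creates a CanNotDelete lock on a resource group with the Azure Python SDK (azure-mgmt-resource). The names are placeholders, and the exact SDK surface can differ between package versions, so treat this as an illustration rather than a definitive recipe.

# Minimal sketch: create a CanNotDelete lock on the resource group that hosts
# the App Service. Resource group, lock name, and credentials are placeholders;
# the SDK surface may differ by azure-mgmt-resource version.
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ManagementLockClient
from azure.mgmt.resource.locks.models import ManagementLockObject

credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<password>", tenant="<tenant-id>")
lock_client = ManagementLockClient(credentials, "<subscription-id>")

# Prevent accidental deletion of everything in the resource group.
lock_client.management_locks.create_or_update_at_resource_group_level(
    resource_group_name="my-webapp-rg",
    lock_name="do-not-delete",
    parameters=ManagementLockObject(level="CanNotDelete",
                                    notes="Protect the production web app"),
)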

How to work with the Azure CLI using MacOS | Azure Tips & Tricks

Thumbnail from How to work with the Azure CLI using MacOS on YouTube

Learn how to get started with the Azure CLI on your MacOS machine. You can easily install the Azure CLI version 2.0 on MacOS by using Homebrew.

Azure DevOps

Using Azure Pipelines for your Open Source Project | The DevOps Lab

Damian speaks to Edward Thomson about how to get started with Azure Pipelines - right from GitHub. The deep integration and GitHub Marketplace app for Azure Pipelines makes it incredibly easy to build your projects no matter what language you're using. You can even use the builds as part of your PR checks.

Lori Lamkin on Shifting to Azure DevOps - Episode 007 | The Azure DevOps Podcast

Lori Lamkin and Jeffrey Palermo discuss what’s next for Lori as Director of PM, her strategy behind leading the big shift from VSTS to Azure DevOps, the current roles and duties within Microsoft Azure DevOps, what she sees as the biggest shift in progressing from Agile and adopting DevOps, and how DevOps has become more and more efficient.

Customers and partners

Expanding the HashiCorp partnership

Prior to speaking at HashiConf '18, Brendan Burns posted about our growing partnership with HashiCorp. Our engineering teams collaborated to expand the resources and services supported by the Terraform provider for AzureRM and added several new integrations with HashiCorp Vault. From Terraform updates in Azure Cloud Shell to Consul Integrations on Azure, this progress has set a strong foundation for future collaboration and advancements in the Cloud and DevOps space.

Screenshot of working with Terraform in Azure Cloud Shell

Building an IoT ecosystem that fuels business transformation

Reducing the complexity of IoT and accelerating time to value is a major differentiator that, when combined with our technology innovations, offers unsurpassed value to both our partners and customers. IoT in Action is a global event series that provides an opportunity for you to connect with other partners in the IoT ecosystem and learn more about the disruptive IoT solutions that are transforming the way business gets done. This post covers examples of Microsoft's partner + platform approach to illustrate how we're connecting partners and bridging their capabilities to expedite time to value for both customers and partners.

What they know now: Insights from top IoT leaders

Hear from industry leaders Henrik Fløe of Grundfos, Doug Weber from Rockwell Automation, Michael MacKenzie from Schneider Electric, and Alasdair Monk of The Weir Group on why they’re bullish on all things IoT, and how they’re leveraging it to innovate and grow.

It’s time to connect your products — here’s why

Connected products are already a part of everyday life. Smart thermostats remotely control home heating and cooling. Printers automatically order new ink cartridges before they run out. Wearable medical devices capture data that helps drive health decisions. To capture your share of this market by offering connected products, clarifying the right use case is the first big step. There are three main approaches: monitoring, control, and automation. But connected products require careful planning to succeed. Read this post to learn more about the pitfalls to avoid and where to get in-depth guidance.

Azure Marketplace new offers – Volume 22

The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. You can also connect with Gold and Silver Microsoft Cloud Competency partners to help your adoption of Azure. In September, 64 offers from Cognosys, Inc. successfully met the onboarding criteria and went live. Cognosys, Inc. continues to be a leading Marketplace publisher, with more than 315 solutions available.

 


Azure This Week - 26 October 2018 | A Cloud Guru

Thumbnail from Azure This Week - 26 October 2018 on YouTube

This time on Azure This Week, Lars talks about Azure Digital Twins in public preview, ONNX runtime for machine learning now in preview, and US governments get more Azure love.

Cross-channel emotion analysis in Microsoft Video Indexer


Many customers across industries want insights into the emotional moments that appear in different parts of their media content. For broadcasters, this can help create more impactful promotion clips and drive viewers to their content; in sales, it can be very useful for analyzing calls and improving conversion; in advertising, it can help identify the best moment to show an ad; and the list goes on. To that end, we are excited to share Video Indexer’s (VI) new machine learning model that mimics human behavior to detect four cross-cultural emotional states in videos: anger, fear, joy, and sadness.

Endowing machines with the cognitive abilities to recognize and interpret human emotions is a challenging task due to the complexity of emotions. As humans, we use multiple mediums to analyze emotions, including facial expressions, voice tonality, and speech content. Ultimately, the determination of a specific emotion results from a combination of these three modalities to varying degrees.

While traditional sentiment analysis models detect the polarity of content – for example, positive or negative – our new model aims to provide a finer granularity analysis. For example, given a moment with negative sentiment, the new model determines whether the underlying emotion is fear, sadness, or anger. The following figure illustrates VI’s emotion analysis of Microsoft CEO Satya Nadella’s speech on the importance of education. At the very beginning of his speech, a sad moment was detected.

Microsoft CEO Satya Nadella's speech on the importance of education

All the detected emotions and their specific appearances along the video are enumerated in the video index JSON as follows:

video index JSON
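As a hedged sketch of how you might read those emotions programmatically, the snippet below fetches a video's index from the Video Indexer API and prints each detected emotion and its time range. The URL pattern and the property names ("emotions", "type", "instances") are assumptions based on the description above, so verify them against the VI developer portal.

# Hedged sketch: fetch a video's index from Video Indexer and list detected
# emotions. The URL pattern, query parameters, and the exact property names
# inside the insights JSON are assumptions -- check the VI developer portal.
import requests

location = "trial"
account_id = "<account-id>"
video_id = "<video-id>"
access_token = "<access-token>"

index = requests.get(
    f"https://api.videoindexer.ai/{location}/Accounts/{account_id}"
    f"/Videos/{video_id}/Index",
    params={"accessToken": access_token},
).json()

for video in index.get("videos", []):
    for emotion in video.get("insights", {}).get("emotions", []):
        for instance in emotion.get("instances", []):
            print(emotion.get("type"), instance.get("start"), "-", instance.get("end"))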

Cross-channel emotion detection in VI

The new functionality utilizes deep learning to detect emotional moments in media assets based on speech content and voice tonality. VI detects emotions by capturing semantic properties of the speech content. However, semantic properties of single words are not enough, so the underlying syntax is also analyzed because the same words in a different order can induce different emotions.

Syntax

VI leverages the context of the speech content to infer the dominant emotion. For example, the sentence “… the car was coming at me and accelerating at a very fast speed …” has no negative words, but VI can still detect fear as the underlying emotion.

VI analyzes the vocal tonality of speakers as well. It automatically detects segments with voice activity and fuses the affective information they contain with the speech content analysis.

Video Indexer

With the new emotion detection capability in VI that relies on speech content and voice tonality, you can gain deeper insight into the content of your videos and leverage it for marketing, customer care, and sales purposes.

For more information, visit VI’s portal or the VI developer portal, and try this new capability for free. You can also browse videos indexed for emotional content: sample 1, sample 2, and sample 3.

Have questions or feedback? We would love to hear from you!

Use our UserVoice to help us prioritize features, or email VISupport@Microsoft.com with any questions.

Microsoft again recognized as a Leader in Gartner Magic Quadrant for Content Services Platforms


Microsoft has pioneered intelligent content services in Microsoft 365 to help our customers achieve more. And the industry has taken note. Today, we are honored that for the second year in a row Gartner has recognized Microsoft as a Leader in the Content Services Platforms Magic Quadrant for 2018. Once again, Microsoft placed highest in Ability to Execute.

Gartner defines a content services platform as:

“A set of services and microservices, embodied as an integrated product suite and applications that share common APIs and repositories, to exploit diverse content types and to serve multiple constituencies and numerous use cases across an organization.”

We couldn’t agree more. SharePoint is the content services platform for Microsoft 365, with capabilities for creating, sharing, protecting, and reusing information. SharePoint hosts digital content like pages, lists, videos, images, designs, 3D, medical scans, and markup as well as traditional Office documents. SharePoint embodies ease of use with ease of management—on any device, for any user, at any location. Importantly, SharePoint collects and extends content experiences to any application in Office 365—including Microsoft Teams and Outlook and, of course, third-party applications as well.

These innovations—along with customers’ transition to the cloud and the growing imperative for secure content collaboration and sharing—are driving growth across Microsoft 365, SharePoint, and OneDrive. More than 350,000 organizations now have SharePoint and OneDrive in Microsoft 365, including 90 percent of the Fortune 500. Active users grew over 90 percent, and data stored in SharePoint grew over 250 percent in the last year alone.

Additionally, Microsoft is the only company recognized as a leader in both the Content Collaboration Platforms and Content Services Platforms Magic Quadrant reports.

Image of the Gartner Magic Quadrant for 2018.

We feel this placement is a further indication of our commitment to our customers, recognizing that Microsoft provides leading content services capabilities, including:

  • SharePoint is easy to set up, with a simple and clean user interface paired with easy but powerful management, and is deeply integrated with OneDrive so you can take your files anywhere.
  • SharePoint empowers you to create beautiful, engaging pages and knowledge portals, enriched with video from Stream and conversations from Yammer, to connect across your organization.
  • We combine the power of artificial intelligence (AI) with content to help you be more productive, make more informed decisions, and keep more secure. This includes video and audio transcription, mobile capture to SharePoint using the top-rated OneDrive mobile app, as well as other features to make Microsoft 365 the smartest place to store your content.
  • Microsoft 365 is easily connected and automated via Microsoft Flow to over 200 cloud and on-premises systems, enriched with Azure cognitive services, and extended with PowerApps and the SharePoint framework for custom development.
  • SharePoint delivers content across Microsoft 365. Whether you are a Firstline Worker working remotely, co-authoring in Office apps, emailing cloud attachments in Outlook, or collaborating with your team in the new chat-based workspace Microsoft Teams, SharePoint provides a consistent set of experiences across the applications.
  • SharePoint can store any file and now supports viewing of over 320 file formats, including Adobe Photoshop (PSD), Illustrator (AI), Acrobat (PDF), as well as video, 3D formats, and DICOM images.
  • With Microsoft Search, we’re introducing new search experiences into the apps you use every day, including Bing.com and Windows, and our vision to connect across your organization’s network of data.
  • SharePoint supports customers ranging in size from small businesses to organizations with hundreds of thousands of users, and has a maximum tenant capacity of 30 trillion documents.
  • SharePoint leverages Microsoft security capabilities such as Advanced Data Governance for automated retention and records management, Data Loss Protection (DLP), eDiscovery, and MIP-based encryption, all with consistent controls across Microsoft 365.
  • With 100+ global datacenters and Microsoft’s global network edge—combined with compliance standards, including ISO 27001, FISMA, and EU Model Clauses—we offer customers trusted enterprise-grade compliance and security.

At Microsoft Ignite last month, we shared exciting Microsoft 365 innovations that build on this foundation. And our innovations are amplified by the depth and breadth of our community. This year, we welcomed a limited number of Microsoft Preferred Partners to our Content Services Partner Program. Each is recognized for proven customer success to modernize, embrace, and extend our content services platform.

To learn more about how SharePoint can help you and your organization, visit our website and download our content services white paper. Finally, download your own complimentary copy of the Gartner Content Services Platforms Magic Quadrant.

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. Gartner does not endorse any vendor, product, or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

The post Microsoft again recognized as a Leader in Gartner Magic Quadrant for Content Services Platforms appeared first on Microsoft 365 Blog.

Azure DevOps Roadmap update for 2018 Q4

In order to provide you with visibility into several of our key investments, we post quarterly updates to the roadmap on our Features Timeline page. Today, we’re sharing the latest for the final quarter of this calendar year. You’ll notice items are grouped by the quarter we anticipate delivering the feature to the hosted service... Read More

Local testing with live data means faster development with Azure Stream Analytics


We are excited to announce that live data local testing is now available for public preview in Azure Stream Analytics Visual Studio tools. Have you ever wanted to test your Azure Stream Analytics query logic with live data without running in the cloud? Are you excited by the possibility of not having to wait for your queries to deploy and other round-trip delays? This enhancement lets you test your queries locally while using live data streams from cloud sources such as Azure Event Hubs, IoT Hub, or Blob storage. You can even use all the Azure Stream Analytics time policies in the local Visual Studio environment. Being able to start and stop queries in seconds, combined with local debugging, significantly improves development productivity by saving precious time on the inner loop of query logic testing.


Live data local testing experience

The new local testing runtime can read live streaming data from the cloud or from a local static file. It works the same as the Azure Stream Analytics cloud runtime and therefore supports the same time policies needed for many testing scenarios. The query runs in a simulated environment suitable for a single-server development setup and should only be used for query logic testing purposes. It is not suitable for performance, scalability, or availability testing.

Key benefits

Query behavior consistency

For any given input, the queries running locally and in the cloud will output the same results. Live data testing in Visual Studio includes full support for time policies and machine learning functions.

Much shorter testing cycles

Compared to running in the cloud, it takes only seconds to start a round of query testing, and you can test your query directly in Visual Studio without having to switch environments. You can see the shape of the incoming data, easily adjust the query, and immediately see the result against the live data.

Viewing job metrics and error messages

Job metrics such as input data size, output data size, and more are available during local execution. Error messages will also be displayed in testing and debugging.

Fluid experience

  • The query runs locally but connects directly to cloud input sources. There is no need to sample data locally first.
  • Query results are immediately displayed in table format in Visual Studio.

Local testing options

In addition to the current local run capability, with this major improvement, we now support all the local testing options listed below.

Input | Output | Job Type
Local static data | Local static data | Cloud/Edge
Live input data | Local static data | Cloud
Live input data | Live output data (Power BI and ADLS are not supported at this moment) | Cloud

Local testing vs. cloud run caveats

This local testing feature should only be used for functional testing purposes. It does not replace the performance and scalability tests you would do in the cloud. It should not be used for production purposes since running in a local environment doesn’t guarantee any service SLA.

While on your machine you can rely only on local resources, when you deploy to the cloud the job can scale out to multiple nodes. Additionally, cloud deployment ensures checkpointing, upgrades, and more. It also provides all the infrastructure to run your job 24/7.

The following table shows the comparison between running a query in a local environment and running it in the cloud. Please note that only the live (cloud) input option supports time policies and machine learning functions; local static input does not.

 

Scenario | Query logic | JavaScript UDF/UDA | Machine Learning function | Time policies | Execution Environment | Resource Allocation | Service SLA | Scalability
Local testing with local input | Yes | Yes | No | No | Local simulated | No CPU/memory commitment | No | No
Local testing with live input | Yes | Yes | Yes | Yes | Local simulated | No CPU/memory commitment | No | No
Running in the cloud | Yes | Yes | Yes | Yes | Cloud | By SU | Yes | Yes

Try it now

This feature is amazingly easy to use. Just choose the input/output type you want in the script editor and click “Run locally”. Follow this guide on Testing live data locally using Azure Stream Analytics tools for Visual Studio to try it!

Providing feedback and ideas

The Azure Stream Analytics team is highly committed to listening to your feedback. We welcome you to join the conversation and make your voice heard via our UserVoice. For tools feedback, you can also reach out to ASAToolsFeedback@microsoft.com.

Did you know we have more than ten new features in public preview? Sign up for our preview programs to try them out. Also, follow us @AzureStreaming to stay updated on the latest features.


Design patterns – IoT and aggregation


In this article, you will learn how to insert IoT data with high throughput and then use aggregations in different fields for reporting. To understand this design pattern, you should already be familiar with Azure Cosmos DB and have a good understanding of change feed, request unit (RU), and Azure Functions. If these are new concepts for you, please follow the links above to learn about them.

Many databases achieve extremely high throughput and low latency because they partition the data. This is true for all NoSQL database engines like MongoDB, HBase, Cassandra, or Azure Cosmos DB. All these databases can scale out virtually without limit because of partitioning or sharding.

Let us look at Azure Cosmos DB more closely. At the top level, a container is defined. You can think of a container as a table, a collection, or a graph; it is the main entity that holds all the information. Azure Cosmos DB uses the word “container” to define this top-level entity, and because Azure Cosmos DB is a multi-model database, this container is synonymous with collections for the SQL, MongoDB, and Graph APIs, and with tables for the Cassandra or Table APIs.

A collection has many physical partitions, which are allocated based on the throughput requirement of the collection. Today, for 10,000 RU you may get ten partitions, but this number may change tomorrow. You should always focus on the throughput you want and not worry about the number of partitions you are allocated; as noted, the number of partitions will change with data usage.

To achieve high scale throughput and low latency, you need to specify a partition key and row key while inserting the data and use the same partition key and row key while reading the data. If you choose the right partition key then your data will be distributed evenly across all the partitions, and the read and write operations can be in single digit milliseconds.

Internally, Azure Cosmos DB uses hash-based partitioning. When you write an item, Azure Cosmos DB hashes the partition key value and uses the hashed result to determine which partition to store the item in. A good partition key will distribute your data equally among all the available partitions as shown in the figure below.


Good partition key, data is equally distributed

There is no one-to-one mapping between partition keys and physical partitions; a single physical partition can store many keys. The partition key is a logical concept, while the partition itself is a physical one, and novice users often conflate the two. Each key is hashed and then mapped to a physical partition using a modulo operation. Each logical partition can store 10 GB of data (a limit that may change in the future), and partitions split automatically when the data grows to more than 10 GB. You never have to worry about splitting partitions yourself; Azure Cosmos DB does it behind the scenes. However, you should never choose a partition key that may accumulate more than 10 GB of data for a single key value.

A million partition keys will not create a million physical partitions.
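To make the logical-versus-physical distinction concrete, here is a tiny Python illustration. It uses Python's built-in hash rather than Cosmos DB's actual hash function, but it shows how a large number of partition key values collapses onto a small, fixed set of physical partitions:

# Toy illustration (not Cosmos DB's real hash function): many logical
# partition keys map onto a small, fixed set of physical partitions via
# hash + modulo, so a million keys never means a million partitions.
from collections import Counter

physical_partitions = 10
device_ids = [f"XYZ-{n:05d}" for n in range(100_000)]   # 100,000 logical keys

placement = Counter(hash(device_id) % physical_partitions for device_id in device_ids)
for partition, key_count in sorted(placement.items()):
    print(f"partition {partition}: {key_count} keys")
# A reasonably uniform hash spreads roughly 10,000 keys onto each of the 10 partitions.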

Now, let’s look at an example. You are working in an IoT company, which has IoT devices installed in buildings to maintain the temperature, and you have hundreds of thousands of customers all around the world. Each customer has thousands of IoT devices which are updating the temperature every minute. Let’s define how the data will look:

{
    "CustomerId": "Microsoft",
    "DeviceId": "XYZ-23443",
    "Temperature": 68,
    "DateTime": 32423512
}

Imagine you have a customer that is a global company with offices in every country and 100,000 IoT devices deployed. These devices are sending 2 KB of data every minute for a daily total of 2 GB. At this rate, you may fill the partition in five days. You could use a time-to-live (TTL) mechanism to delete the data automatically, but for our example let’s assume you have to keep this data for 30 days.

If you choose “CustomerId” as the partition key, you will see your data is skewed for large customers and your partitions will look as shown below.

Bad partition key, data is skewed toward large customers

This kind of partitioning will also cause throttling for large customers, who have thousands of IoT devices and are inserting data into a collection partitioned on “CustomerId”. You may wonder why it will be throttled. To understand that, imagine your collection is defined to have 5,000 RU/sec and you have five partitions. This means each partition can serve 1,000 RU/sec.

Please note, we said five partitions here, but this number is again only for discussion's sake; it may change in the future for the same throughput. With changes in hardware, tomorrow you may get just three partitions, or one, for 5,000 RU. Remember, physical partitions are not constant; they keep splitting automatically as your data grows.

Users often make this mistake and then complain that they are being throttled at 2,000 RU even though they have provisioned the collection for 5,000 RU. In this scenario, the main issue is that their data is not partitioned properly and they are trying to push 2,000 RU of writes into one partition. This is why you need a good partition key that can distribute your data evenly across all partitions.

If “CustomerId” is not a good partition key, then what other keys can we use? You also would not want to partition the data on “DateTime”, because this will create a hot partition: if you partition the data on time, then for a given minute all the calls will hit one partition. And if you need to retrieve the data for a customer, it will be a fan-out query because the data may be distributed across all the partitions.

To choose the right partition key, you must think through and optimize for your read or write scenarios. For simpler workloads, a single partition key can serve both reads and writes; if that is not the case, you must compromise and optimize for one of them.

In this article, we are exploring the scenario in which we do not have one right partition key for both read and write. Let’s see what we can do to satisfy both the read and write requirements.

In this scenario, we have to optimize for writing because of the high number of devices sending us data. It is best to define the collection with “DeviceId” as the partition key for fast ingestion. “DeviceId” is not only unique but also more granular than “CustomerId”. Always look for a key with higher cardinality or uniqueness so your data will be distributed across all partitions. However, what if for reporting you want to aggregate over “CustomerId”?

This is the crux of this blog. You would like to partition the data for the insert scenario and also group the data on a different partition key for the reporting scenario. Unfortunately, these are mismatched requirements.

Imagine you have inserted the data with “DeviceId” as the partition key, but now you want to group temperature by “CustomerId”; your query will be a cross-partition query. Cross-partition queries are fine for occasional use. Since all data is indexed by default in Azure Cosmos DB, cross-partition queries are not necessarily bad, but they can be expensive, costing you far more RUs than point lookups.

You have two options to solve this problem. Your first option is to use Azure Cosmos DB’s change feed and an Azure Function to aggregate the data per hour and then store the aggregated data in another collection, where “CustomerId” is the partition key.

Change feed reporting diagram

You can then listen to the change feed of the reports-per-hour collection to aggregate the data per day and store that aggregation in another reports-per-day collection. IoT devices send data directly to Azure Cosmos DB. This pattern is possible because of the change feed, which exposes the log of Cosmos DB and includes the insert and update operations made to documents within the collection. Read more about change feed. Please know that change feed is enabled by default for all accounts and all collections.

To learn more about how to use the change feed and Azure Functions, check out this screencast.
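To make this first option more concrete, below is a minimal Python sketch of the idea, assuming the v4-style azure-cosmos SDK and the sample document shape above (with DateTime as epoch seconds). Inside an Azure Function the changed documents would instead arrive through the Cosmos DB trigger, but the aggregation logic is the same; all names here are placeholders.

# Hedged sketch of option 1: read changed IoT documents from the change feed
# and upsert per-customer, per-hour temperature averages into a separate
# "ReportsPerHour" container partitioned on CustomerId.
from collections import defaultdict
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", "<master-key>")
database = client.get_database_client("IoT")
telemetry = database.get_container_client("IoT")
reports = database.get_container_client("ReportsPerHour")

# Accumulate sum/count per (CustomerId, hour bucket) from the change feed.
buckets = defaultdict(lambda: [0.0, 0])
for doc in telemetry.query_items_change_feed(is_start_from_beginning=True):
    hour = doc["DateTime"] // 3600                       # assuming epoch seconds
    key = (doc["CustomerId"], hour)
    buckets[key][0] += doc["Temperature"]
    buckets[key][1] += 1

# Upsert one aggregate document per customer per hour; the id makes reruns idempotent.
for (customer_id, hour), (total, count) in buckets.items():
    reports.upsert_item({
        "id": f"{customer_id}-{hour}",
        "CustomerId": customer_id,          # partition key of the reports container
        "Hour": hour,
        "AvgTemperature": total / count,
    })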

The second option is to use Spark to do the aggregation and keep the aggregated values in SQL Data Warehouse or in a second collection where the partition key is “CustomerId”.

This option also uses the change feed. From Spark, you can connect directly to the change feed and receive all the changes in near real time. Once the data is in Spark, you can do the aggregation and then write that data back to Azure Cosmos DB or to SQL Data Warehouse.

Here is the code snippet for Spark to read the data from Azure Cosmos DB, do the aggregation, and write the data back.

# Base configuration for reading the raw IoT telemetry collection
iotConfig = {
    "Endpoint": "https://xx.documents.azure.com:443/",
    "Masterkey": "E0wCMaBIz==",
    "Database": "IoT",
    "preferredRegions": "Central US;East US2",
    "Collection": "IoT",
    "checkpointLocation": "dbfs://checkpointPath"
}

# Connect via the Spark connector to create a Spark DataFrame over the collection
iot_df = spark.read.format("com.microsoft.azure.cosmosdb.spark").options(**iotConfig).load()
iot_df.createOrReplaceTempView("c")

# Target configuration: the aggregated ("MV") collection partitioned on CustomerId
writeConfig = {
    "Endpoint": "https://xx.documents.azure.com:443/",
    "Masterkey": "E0wCMaBKdlALwwMhg==",
    "Database": "IoT",
    "preferredRegions": "Central US;East US2",
    "Collection": "MV",
    "Upsert": "true"
}

# Aggregate the average temperature per customer and upsert the results
psql = spark.sql("select CustomerId, avg(temp) as Temp_Avg from c group by c.CustomerId")
psql.write.format("com.microsoft.azure.cosmosdb.spark").mode('append').options(**writeConfig).save()

Check the screen cast to learn how to use Spark with Azure Cosmos DB.

Both options can give you per-minute aggregation by listening to the live change feed. Depending on your reporting requirements, you can keep aggregations at different levels in different collections or in the same collection, or keep the aggregated values in SQL Data Warehouse.

A first look at changes coming in ASP.NET Core 3.0


While we continue to work on finalizing the next minor version of ASP.NET Core, we’re also working on major updates to our next release that will include some changes in how projects are composed with frameworks, tighter .NET Core integration, and 3rd party open source integration, all with the goal of making it easier and faster for you to develop. For broader context around .NET Core 3.0, we encourage you to check out our previous announcements around the addition of WinForms and WPF support to .NET Core 3.0 on Windows. We’ll publish more details about new features coming in ASP.NET Core 3.0 in the near future.

Packages vs. Frameworks

For historical context, the way projects reference and run on ASP.NET Core has changed through the versions and years. In 1.0, ASP.NET Core itself was “just packages”, and appeared in projects like any other NuGet package reference. This had benefits and drawbacks, and over time we’ve evolved this model to try and balance the advantages of modular references with those of larger, prerequisite frameworks. In 2.1, ASP.NET Core eventually evolved to be available as a .NET Core “shared framework” (like the base of .NET Core itself, Microsoft.NETCore.App, has been since 1.0). This blog post by ASP.NET Core team member Nate McMaster does a good job of explaining how the shared framework works while also highlighting some of the issues with the current approach. Updates we’re introducing in 3.0 are designed to reduce these issues for all our users.

As part of this change, some notable sub-components will be removed from the ASP.NET Core shared framework in 3.0:

  • Json.NET (Newtonsoft.Json)
  • Entity Framework Core (Microsoft.EntityFrameworkCore.*)

See this announcement for more details regarding the addition of JSON APIs in .NET Core. For places in ASP.NET Core that rely on Json.NET features today (e.g., the JSON formatter in MVC), we’ll continue to ship packages that provide that integration moving forward; however, the default experiences will change to use the upcoming in-box JSON APIs.

Entity Framework Core will ship as “pure” NuGet packages in 3.0. This makes its shipping model the same as all other data access libraries on .NET, and allows it the simplest path to continue innovation while providing support for all the various .NET platforms customers enjoy it on today. Note, Entity Framework Core moving out of the shared framework has no impact on its status as a Microsoft developed, supported, and serviceable library, and it will continue to be covered by the .NET Core support policy.

Fully leveraging .NET Core

As announced on the .NET Blog earlier this month, .NET Framework will get fewer of the newer platform and language features that come to .NET Core moving forward, due to the in-place update nature of .NET Framework and the desire to limit changes there that might break existing applications. To ensure ASP.NET Core can fully leverage the improvements coming to .NET Core moving forward, ASP.NET Core will only run on .NET Core starting from 3.0. Moving forward, you can simply think of ASP.NET Core as being part of .NET Core.

Customers utilizing ASP.NET Core on .NET Framework today can continue to do so in a fully supported fashion using the 2.1 LTS release. Support and servicing for 2.1 will continue until at least August 21, 2021 (3 years after its declaration as an LTS release) in accordance with the .NET Core support policy.

For more information about porting from .NET Framework to .NET Core, see this documentation.

Delivering more value with focused 3rd party open-source integration

At the same time that we’re drawing a clearer distinction about what constitutes the “platform” in 3.0, and in doing so removing 3rd party components from that layer, we recognize that many higher-level scenarios are best served by established, capable, and well-supported open-source components, and that we can support the community and our customers by helping ensure these components integrate as well as possible into ASP.NET Core applications.

This support will take different forms, including first-class integration APIs & packages built by our team, contributions made to existing libraries by our engineers, project templates in the default experiences that utilize these libraries, documentation that lives on the official ASP.NET Core docs site, and processes for dealing with critical issues and bug fixes, including security.

We’ve already begun this in the 2.2 wave, with new integration being developed for the popular IdentityServer library, which will help us deliver a simple and functional story for API Authorization in ASP.NET Core applications, while allowing customers to leverage the full power of IdentityServer when they need to.

We’re also working on streamlining the experience for building HTTP APIs, with new API Conventions and analyzers that make working with popular Open API libraries like Swashbuckle and NSwag easier, and a new API client generation system that allows for simple integration with code generators like AutoRest and NSwag.

For folks excited by our new Health Checks feature, the owners of the BeatPulse library are working to port their extensive library of checks over.

We intend to bring these experiences together in a new project template to be made available in the period after 2.2 ships.

Conclusion

Stay tuned for more updates as we continue our updates to ASP.NET Core in .NET Core 3.0, including a summary of the new features we’re working to enable as part of this release. We also regularly post details of changes and other information on our announcements repo, which we encourage you to subscribe to.

Bringing digital ledger interoperability to Nasdaq Financial Framework through Microsoft Azure Blockchain


Nasdaq revolutionized financial markets in 1971 by opening them up to millions of individual investors with the world’s first-ever electronic stock exchange. Since then, the company’s innovative spirit has kept them on the leading edge of technology evolution in capital markets, while transforming the brand into a powerhouse technology provider.

Nasdaq’s commercial technology arm, Market Technology, powers mission-critical capital markets infrastructure at more than 100 exchanges, clearinghouses, and central securities depositories across 50 countries.

To better serve its global clientele, Nasdaq re-architected its enterprise offerings and released the Nasdaq Financial Framework (NFF) in 2016. The Framework’s modular design leverages a single operational core that powers a vast portfolio of business applications and services across the trading lifecycle, enabling Nasdaq customers to easily add, remove and amend their mission-critical technology stack without the cost and complexity associated with monolithic infrastructure systems.

NFF was designed also to take advantage of the benefits of emerging technologies to enable key strategic advantages for customers.

“Our [capital markets] industry is evolving faster than ever with the advent and advancement of cloud, blockchain, machine intelligence and others. Key players in the industry are looking to these technologies to explore how they can become more effective and efficient, but also gain competitive advantage,” says Magnus Haglind, Senior Vice President and Head of Product Management for Nasdaq’s market technology business.

To accelerate Nasdaq’s blockchain capabilities aligned with the industry’s rising demand, the company is integrating the Nasdaq Financial Framework with Microsoft Azure Blockchain to build a ledger agnostic blockchain capability that supports a multi-ledger strategy.

Nasdaq

Azure will deliver highly secure interoperation and communication between the Nasdaq Financial Framework core infrastructure, ecosystem middleware and customer technologies, using an innovative blockchain microservices suite to execute transactions and contracts.

These offerings create a sophisticated blockchain system that allows various technologies to work together in a secure, scalable way and enables Nasdaq to meet its customer requirements across multiple projects.

Nasdaq sees immediate opportunity for blockchain to manage the delivery, payment, and settlement of transactions that may reside on multiple blockchains with different payment mechanisms. “With multiple blockchains in use by various industry participants, we believe that the combination of NFF and Microsoft’s blockchain technology can remove some of the project complexities that exist in this realm,” says Tom Fay, Senior Vice President of Enterprise Architecture at Nasdaq. “Additionally, as more industries move towards capital markets technology and structures, we see the potential for blockchain to provide value in secure, frictionless and instantaneous matching of buyers and sellers.”

These capabilities will also allow capital market organizations to use any NFF-based applications that incorporate blockchain technologies without the need for ledger-specific skills or knowledge, which is critical as the industry pursues new use cases for distributed ledger technology.

Fay continues, “from our perspective, Azure is unlocking the power of the blockchain while removing major complexities. Our NFF integration with their blockchain services provides a layer of abstraction, making our offering ledger-agnostic, secure, highly scalable, and ultimately helps us continue to explore a much broader range of customer use cases for blockchain. Delegating to Microsoft – as a best of breed enterprise-quality partner and leader in the blockchain space – to handle all of the semantics of ledger communication, security, deployment and orchestration, allows us to focus on customer challenges and solutions at scale, rather than expending resources on building components that fall outside our core business.”

At Microsoft, we are honored to partner with Nasdaq to evolve their innovative market-enabling technology. As the leading enterprise platform company, we are fundamentally committed to the success of our customers. We are excited to work with Nasdaq to apply cloud and blockchain technologies to industry problems that they deeply understand, and enable their continued success.

Building an ecosystem for responsible drone use and development on Microsoft Azure


The next wave of computing is already taking shape around us, as IoT enables businesses to sense all aspects of their operations in real time and take informed action, and as cloud workloads run on those IoT devices themselves so they don’t require “always on” connectivity to the cloud to make real-time, context-aware decisions. This is the intelligent edge, and it will define the next wave of innovation, not just for business, but also for how we address some of the world’s most pressing issues.

Drones or unmanned aircraft systems (UAS) are great examples of intelligent edge devices being used today to address many of these challenges, from search and rescue missions and natural disaster recovery, to increasing the world’s food supply with precision agriculture. With the power of AI at the edge, drones can have a profound impact in transforming businesses and improving society, as well as in assisting humans in navigating high-risk areas safely and efficiently.   

With these advanced capabilities also comes great responsibility, including respecting the laws that govern responsible drone use in our airspace as well as how drones are applied to scan the environment. We believe it is important to protect data wherever it lives, from the cloud to the intelligent edge.

In addition to building the platforms for innovation, we at Microsoft believe it is crucial to also invest in companies and partnerships that will enable the responsible use of drones and associated data. Today we are making two important announcements that further our commitment to responsible use of drones as commercial IoT edge devices running on Microsoft Azure.

AirMap powered by Microsoft Azure

Today we announced that AirMap has selected Microsoft Azure as the company's exclusive cloud-computing platform for its drone traffic management platform and developer ecosystem.

Drone usage is growing quickly across industries such as transportation, utilities, agriculture, public safety, emergency response, and more, to improve the efficiency and performance of existing business processes. Data generated by drones is infused with intelligence to augment the value that companies and governments deliver to customers and communities. However, concerns about regulatory compliance, privacy, and data protection still prevent organizations from adopting drone technology at scale.

AirMap works with civil aviation authorities, air navigation service providers, and local authorities to implement an airspace management system that supports and enforces secure and responsible access to low-altitude airspace for drones.

With AirMap’s airspace management platform running on Microsoft Azure, the two companies are delivering a platform that will allow state and local authorities to authorize drone flights and enforce local rules and restrictions on how and when they can be operated. Their solution also enables companies to ensure that compliance and security are a core part of their enterprise workflows that incorporate drones.

AirMap selected Microsoft Azure because it provides the critical cloud-computing infrastructure, security, and reliability needed to run these mission-critical airspace services and orchestrate drone operations around the world.

Earlier this year, Swiss Post, the postal service provider for Switzerland, joined with drone manufacturer Matternet and Insel Group, Switzerland’s largest medical care system, to fly time-sensitive laboratory samples between Tiefanau Hospital and University Hospital Insel in Bern. This was an alternative to ground transport, where significant traffic can cause life-threatening delays. The operations are supported by Switzerland’s airspace management system for drones, powered by AirMap and Microsoft Azure, for safety and efficiency.

DJI Windows SDK for app development enters public preview

Last May at our Build developer conference, we announced a partnership with DJI, the world’s leader in civilian drones and aerial imaging technology, to bring advanced AI and machine learning capabilities to DJI drones, helping businesses harness the power of commercial drone technology and edge cloud computing.

Today at DJI’s AirWorks conference, we are announcing the public preview of the Windows SDK, which allows applications to be written for Windows 10 PCs that control DJI drones. The SDK will also allow the Windows developer community to integrate and control third-party payloads like multispectral sensors, robotic components like custom actuators, and more, exponentially increasing the ways drones can be used in the enterprise.

With this SDK, we now have three methods to enable Azure AI services to interact with drone imagery and video in real-time:

  1. Drone imagery can be sent directly to Azure for processing by an AI workload.
  2. Drone imagery can be processed on Windows running Azure IoT Edge with an AI workload.
  3. Drone imagery can be processed directly onboard drones running Azure IoT Edge with an AI workload.
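As a concrete example of the first method listed above, here is a minimal Python sketch that posts a captured drone image to an Azure Cognitive Services endpoint for analysis. The region, key, and the Computer Vision analyze route are placeholders and assumptions; in practice the AI workload could just as well be a custom model behind your own endpoint.

# Hedged sketch of method 1: send a drone image directly to Azure for analysis.
# Region, key, and the Computer Vision route are placeholders/assumptions.
import requests

endpoint = "https://westus2.api.cognitive.microsoft.com"   # hypothetical resource
subscription_key = "<computer-vision-key>"

with open("drone_frame.jpg", "rb") as image:
    response = requests.post(
        f"{endpoint}/vision/v2.0/analyze",
        params={"visualFeatures": "Objects,Description"},
        headers={"Ocp-Apim-Subscription-Key": subscription_key,
                 "Content-Type": "application/octet-stream"},
        data=image.read(),
    )
response.raise_for_status()

# Print the generated captions describing what the drone is looking at.
print(response.json().get("description", {}).get("captions"))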

We take the security of data seriously, from the cloud to edge devices such as drones. Azure IoT Edge includes an important subsystem, called the security manager, which acts as a core for protecting the IoT Edge device and all its components by abstracting the secure silicon hardware. It is the focal point for security hardening and provides IoT device manufacturers the opportunity to harden their devices based on their choice of hardware secure modules (HSM). Finally, the Azure certified for IoT program only certifies third-party Azure IoT Edge hardware that meets our strict security requirements.

We are encouraged by all the creative applications of drones we are seeing on Azure today across industry sectors, and the following are a few examples of this.

  • SlantRange: An aerial remote sensing and data analytics company serving the information needs of the agriculture industry, with over two decades of experience in earth science, defense, and intelligence applications. The company has patented foundational technologies for aerial crop inspections and introduced innovative analytical methods that deliver valuable agronomic data within minutes of collection, anywhere in the world, using low-power edge computing devices. These technologies have propelled SlantRange's growth from a few Nebraska corn and soybean farms just a few seasons ago to over 40 countries and a wide variety of crops today, including contracts with many of the world's leading agricultural suppliers and producers.
  • Clobotics: A global AI company helping wind energy companies improve their productivity by automatically inspecting, processing, and reporting wind turbine blade defects. The company has inspected over 1,000 wind turbines around the world in the past few months. Clobotics’ end-to-end solutions combine computer vision, machine learning, and data analytics software with commercial drones and sensors to help the wind energy industry automate its inspection service. Clobotics’ Wind Turbine Data Platform, along with its computer vision-based edge computing service, provides a first-of-its-kind wind turbine blade life-cycle management service. As a Microsoft worldwide Azure partner, Clobotics works closely with Microsoft Azure and the IoT platform to provide the most innovative and reliable services to its enterprise customers.
  • eSmart Systems: Brings more than 20 years of global intelligence to provide software solutions to the energy industry, service providers, and smart cities. By bringing AI to the edge on Azure IoT Edge, eSmart Systems is revolutionizing grid inspections. Their Connected Drone solution can analyze 200,000 images in less than one hour, which is more than a human expert can review in a year. The result is reduced operational cost, better insights into the current status of the grid, and fewer outages.

Earlier this year we announced a $5 billion investment in IoT and the intelligent edge to continue innovation, strategic partnerships, and programs. The latest announcements with strategic partners like AirMap and DJI continue to push the boundaries of what is possible, and we look forward to seeing what our joint customers go on to build.

Plan your next trip – Start a new itinerary on Bing today


This summer we announced the ability to customize Bing itineraries to make them your own, but we know that sometimes it’s nice to start from a clean slate when planning the perfect trip.  Now you can!

To check it out, open My Places in Bing Maps and choose “New Itinerary” from the Itineraries tab.  Next, choose your destination, set your dates if you already know them, and then you’re ready to get started!

New Itinerary

 Now you are ready to add attractions, restaurants, hotels, and more to your itinerary. Click on the “Find attractions” button to see some of the most popular things to do in your destination, or just search for what you are looking for on the map and click “Add to itinerary”.

My Places

My Places isn't the only way to get started; you can also start a new itinerary anywhere you see the save option on the map.

Yellowstone National Park

Happy travels, and check back soon; we've got more updates coming. If you have suggestions or feedback, share them using the Feedback link on the Bing Maps page. Itineraries on Bing Maps are only available for US and UK users at this time.

- The Bing Team

Simplified restore experience for Azure Virtual Machines


Azure Backup now offers an improved restore experience for Azure Virtual Machines by leveraging the power of ARM templates and Azure Managed Disks. The new restore experience directly creates managed disk(s) and virtual machine (VM) templates. This eliminates the manual process of executing scripts or PowerShell commands to convert and configure the .VHD file and complete the restore operation. There is zero manual intervention after the restore is triggered, making it truly a single-click operation for restoring IaaS VMs.

A managed disk ARM template is automatically created in the customer’s storage account during the restore disk operation, which can be deployed to create a VM either as part of the restore operation or at a later time. Parameters in the template can also be edited to customize the restored VM as required, providing flexibility in the VM creation process.
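
As a rough sketch (not part of the original announcement), the “Restore disks” flow can be driven from the Azure CLI and the generated template deployed afterwards. The resource group, vault, item, and storage account names below are hypothetical, and the exact az backup flags may vary by CLI version:

# List recovery points for the protected VM (names are placeholders)
$ az backup recoverypoint list --resource-group MyRG --vault-name MyVault \
    --container-name MyVM --item-name MyVM --output table
# Restore the disks and generate the deployment template in a staging storage account
$ az backup restore restore-disks --resource-group MyRG --vault-name MyVault \
    --container-name MyVM --item-name MyVM --rp-name <recovery-point-name> \
    --storage-account mystagingaccount
# Deploy the generated template (via its SAS URL) to create the restored VM
$ az group deployment create --resource-group MyRG \
    --template-uri "<SAS URL of the template created in the staging storage account>"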

In addition to the above-mentioned improvements, the naming conventions for the restored disks are now more intuitive, making it easier to identify the virtual machine associated with the disks during restore operations. The naming conventions are carefully chosen according to the restore path selected by the user, namely “Create new” and “Restore disks”.

Create new

Restore configuration

While restoring the VM through the “Create new” flow, the target VM name is used as the prefix for the managed disk names, along with the date and time of restoration, in the following format:

  • OS disk name: <targetVMName>-osdisk-<yyyymmdd-hhmmss>
  • Data disk name: <targetVMName>-datadisk-<dno>-<yyyymmdd-hhmmss>

Restore disks

While restoring the VM as disks, the source VM name is used as the prefix for the disk names, along with the date and time of restoration, in the following format:

  • OS disk name: <SourceVMName>-osdisk-<yyyymmdd-hhmmss>
  • Data disk name: <SourceVMName>-datadisk-<dno>-<yyyymmdd-hhmmss>

Related links and additional content

Follow us on Twitter @AzureBackup for the latest news and updates.

Fringe FM conversation on AI Ethics


A few weeks ago, I had a lively conversation with Matt Ward for the Fringe FM podcast, where we discussed artificial intelligence, its applications, and the ethical implications thereof. 

During the conversation I mentioned offhand an article I'd read recently which suggested that a program to identify criminals using face recognition in CCTV suffered from a high rate of misidentifications. I couldn't remember the exact rate at the time, and in the quote shown below I said "50% or 90%". Turns out I was being too conservative: the actual rate was 98%. Just as with human-based systems, no AI system is perfect, and classifications are based on a confidence score exceeding some threshold. Set the threshold too low, as was undoubtedly the case here, and the result will be many false positives. It's always important for AI developers to consider the impact of false positives and false negatives, and to take particular care to consider the impact of those negative determinations, especially when they relate to people's lives.

Fringe quote

During the podcast I also mentioned the Microsoft group headed by Kate Crawford which is doing some of the fundamental research into AI ethics, but failed to mention the name of the group: Fairness, Accountability, Transparency and Ethics in AI (FATE). You can find their publications in the Microsoft Research Catalog — the AI Now Report in particular is worth reading for its recommendations.

You can find the Fringe FM podcast in your favorite podcast app (mine is Pocket Casts), or listen to Episode 46 directly at the link below.

Fringe FM: 46. The Ethics of AI in an Era of Big Data | David Smith


Azure Marketplace new offers – Volume 23


We continue to expand the Azure Marketplace ecosystem. From September 16 to September 30, 2018, 33 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual machines

1 Click secured Joomla on Ubuntu 18.04 LTS: Joomla is an open-source content management system for publishing web content. It is built on a model-view-controller web application framework that can be used independently of the CMS.

AISE PyTorch GPU Notebook: This is a fully integrated deep learning software stack with PyTorch, an open-source machine learning library for Python; Python, a high-level programming language; and Jupyter Notebook, a browser-based interactive notebook.

AISE PyTorch GPU Production: This is a fully integrated deep learning software stack optimized for NVidia GPU. It includes PyTorch, an open-source machine learning library for Python, and Python, a high-level programming language for general-purpose programming.

AISE TensorFlow GPU Notebook: This deep learning software stack is optimized for NVidia GPU. It includes TensorFlow, an open-source software library for machine learning; Keras, an open-source neural network library; Python, a programming language; and Jupyter Notebook.

AISE TensorFlow GPU Production: This is a fully integrated deep learning software stack optimized for NVidia GPU. It includes TensorFlow, an open-source software library for machine learning, and Python, a high-level programming language for general-purpose programming.

CentOS 7.4 Hardened - Antivirus & Auto Updates: CentOS 7.4 (Community Enterprise Operating System) is a Linux distribution that aims to provide a free, enterprise-class, community-supported computing platform functionally compatible with its upstream source.

CentOS 7.5 Hardened - Antivirus & Auto Updates: CentOS 7.5 (Community Enterprise Operating System) is a Linux distribution that aims to provide a free, enterprise-class, community-supported computing platform functionally compatible with its upstream source.

Exact Globe: Exact Globe is an innovative ERP software. New customers can choose from a range of industry- and process-oriented complete business software solutions. Existing customers can expand Globe with additional modules, users, and administrations.

Exivity - Hybrid Cloud Billing Solution: Exivity provides a metering and billing solution that covers virtually any IT service delivery model.

InterSystems IRIS Data Platform Single Node: This is a complete, cloud-based data platform for SQL and NoSQL database management systems, integration, and analytics.

KNIME® Server: Regardless of whether you use the KNIME® Analytics Platform for advanced analytics, machine learning, business intelligence, or ETL tasks, you can use the KNIME Server to extend analytics to your team. Share data, nodes, metanodes, and workflows.

MigrateR Master Node: MigrateR is a platform for data migrations. It performs readiness assessments, estimates the monthly costs of running on-premises machines in the cloud, visualizes dependencies of on-premises machines, and transfers data to and from the cloud.

One Identity Safeguard for Privileged Sessions: Safeguard for Privileged Sessions serves as a proxy and inspects protocol traffic on the application level. It can reject any traffic violating the protocol, making it an effective shield against attacks.

Panzura Freedom CloudFS 7.1.4.0 13222: The Panzura CloudFS™ underpins the Freedom Family and is a scale-out, distributed file system purpose-built for the cloud, incorporating intelligent file services backed by 26 patents.

SecureSphere Database Activity Monitor v13.2: SecureSphere Database Activity Monitor audits and protects your business-critical databases on Microsoft Azure.

Snapt: Snapt is a software-only load balancer, web accelerator, and application firewall for DevOps.

SQL Server 2016 SP1 Ent w/ VulnerabilityAssessment: This image includes many new database engine features and performance improvements. Utilities like SQL vulnerability assessment through SQL Server Management Studio, VS Code, and FTP client have been provided.

SQL Server 2016 SP1 Std w/ VulnerabilityAssessment: This image includes many new database engine features and performance improvements. Utilities like SQL vulnerability assessment through SQL Server Management Studio, VS Code, and FTP client have been provided.

SQL Server 2016 SP1 Web w/ VulnerabilityAssessment: This image includes many database engine features and performance improvements. Utilities like SQL vulnerability assessment through SQL Server Management Studio, VS Code, and FTP client have been provided.

SQL Server 2016 SP2 Ent w/ VulnerabilityAssessment: This image includes many new database engine features and performance improvements. Utilities like SQL vulnerability assessment through SQL Server Management Studio, VS Code, and FTP client have been provided.

SQL Server 2016 SP2 Std w/ VulnerabilityAssessment: This image includes many new database engine features and performance improvements. Utilities like SQL vulnerability assessment through SQL Server Management Studio, VS Code, and FTP client have been provided.

SQL Server 2017 Standard with Debug Utils: This image includes many new database engine features and performance improvements. Utilities like SQL vulnerability assessment through SQL Server Management Studio, VS Code, and FTP client have been provided.

SUSE 15 Hardened: SUSE 15 is a Linux distribution that aims to provide a free, enterprise-class, community-supported computing platform.

Ubuntu 18.04 Hardened - Auto Updates + Antivirus: Ubuntu is a popular operating system running in hosted environments.

Wallarm - Next-Gen Web Application Firewall (WAF): Wallarm automates application protection and security testing for websites, microservices, and APIs. It protects against all types of threats and is reliable in any environment.

Web applications

CloudBlaze: CloudBlaze is an automated, configuration-driven data ingestion solution that acts as a data feeder for Azure Data Factory. CloudBlaze facilitates modernization of enterprise data assets to maximize performance and reduce cost.

Starburst Presto for Azure HDInsight: Presto is a fast and scalable distributed SQL query engine. Architected for the separation of storage and compute, Presto is perfect for querying data in Azure Blob Storage, Azure Data Lake Storage, and many other services.

Stratis C# Full Node: Stratis' enterprise-grade development platform is an end-to-end blockchain solution for native C# and .NET blockchain applications. Our blockchain combines advances in security with breakthroughs in network speed, scalability, and customization.

TIBCO Enterprise Message Service: This template deploys TIBCO Enterprise Message Service on multiple Linux (Red Hat or CentOS) virtual machines in a fault-tolerant configuration.

Veritas™ Resiliency Platform Gateway Install: Veritas Resiliency Platform provides single-click disaster recovery and migration for any source workload into Azure. This Bring Your Own License (BYOL) version is to install an additional replication gateway.

Veritas Resiliency Platform Resiliency Manager: Veritas Resiliency Platform provides single-click disaster recovery and migration for any source workload into Azure. This Bring Your Own License (BYOL) version is to install an additional resiliency manager.

Container solutions

McAfee Database Security: McAfee® Database Security is an easy-to-deploy and highly scalable software solution that monitors the database management system and protects it from internal and external threats, and even intra-database exploits.

NeuVector Container Security Platform: NeuVector delivers a highly integrated, automated security platform for Kubernetes and Red Hat OpenShift. Monitor and protect east-west container network traffic.

Your guide to Azure Stack, Azure Data Box, and Avere Ignite sessions


On behalf of the Azure Stack team, I would like to extend our thanks and appreciation to Ignite 2018 attendees for the overwhelming response to our series of Azure Stack, Azure Data Box, and Avere sessions. Now, we’re thrilled to share the sessions on-demand to everyone, everywhere.

Before diving into the session videos below, I encourage you to learn more about the Azure solutions enabling a new era of intelligent cloud and intelligent edge computing. Read an excellent overview from Julia White, Corporate Vice President of Microsoft Azure, and visit our new Future of Cloud webpage.

Azure Stack

These sessions are packed with information, use cases, and guidance to help you understand the principles of Azure Stack. Explore use cases for implementing the solution and get started.

To help you zero in on content relevant to your interests and expertise, follow our two learning paths—four sessions tailored for operators and four sessions for developers (with a special session for CSPs), then explore over 12 additional sessions that drill down into topics that may interest you.

To learn more about Azure Stack, visit the Azure Stack website, view the documentation, download the development kit, and explore real-world stories of Azure Stack in action.

Azure Stack session diagram

Start your learning journey here

Azure Stack overview and roadmap

Watch on-demand | View slide deck

The best-attended Azure Stack session at Ignite is essential viewing. Learn how Azure Stack fits in the Microsoft hybrid cloud strategy, view the product roadmap, and explore solution patterns to get up and running with Azure Stack quickly.

Delivering intelligent edge with Microsoft Azure Stack and Data Box

Watch on-demand | View slide deck

Learn the direction that Microsoft is taking to help customers create adaptive systems that participate in a consistent platform across the edge and cloud.

Operator path

Get started as an Azure Stack operator by watching these four learning sessions—your first step to develop a plan for operationalizing Azure Stack in your environment.

The guide to becoming a Microsoft Azure Stack operator

Watch on-demand | View slide deck

This session provides you the information necessary for being the champion in your organization. This is a prerequisite for anyone wanting to get started with Microsoft Azure Stack.

Discovering the importance of security design principles and key use cases for Azure Stack

Watch on-demand | View slide deck

Learn about the key security design principles and use cases that Azure Stack enables. This session explores how Assume Breach and Hardened by Default principles give you a secure, hybrid cloud on top of which you can build your applications. Also learn how Azure Stack helps you meet the strictest compliance standards.

Best practices for planning Azure Stack deployment and post-deployment integrations with Azure

Watch on-demand | View slide deck

This session guides you through tools to size the solution and the steps to integrate Microsoft Azure Stack into the environment. Learn about typical datacenter integration touchpoints and how to administer the Azure Stack infrastructure, including monitoring and hardware management. Finally, learn how to set up proper plans, offers, and quotas so that your tenants/customers can start consuming the resources hosted in your Azure Stack environment.

Understanding architectural patterns and practices for business continuity and disaster recovery on Microsoft Azure Stack

Watch on-demand | View slide deck

Learn how to recover from catastrophic data loss. Understand the benefits of connecting Azure Stack to Azure Site Recovery and Azure Backup. We discuss both infrastructure business continuity and disaster recovery (BCDR) and application patterns in the tenant space.

Developer path

Ready to start developing solutions on the Azure Stack platform? This learning path is for you—a set of informative sessions to accelerate your journey to build intelligent applications everywhere the business needs them.

Getting started with Microsoft Azure Stack as a developer

Watch on-demand | View slide deck

Are you a developer wanting to build your applications anywhere? Learn about all the available services in Azure Stack, the Azure SDKs, and how to use PowerShell and CLI for automation. Also learn how to download and install your own Azure Stack for learning, development, and evaluation, as well as how to leverage your Azure subscription to enable prototyping in Azure.

Understanding hybrid application patterns for Microsoft Azure Stack

Watch on-demand | View slide deck

Hybrid applications are becoming the norm for most organizations today. In this session, learn different application patterns, such as artificial intelligence (AI) patterns, to expedite development of your solutions on Azure Stack and deploy anywhere, whether on Azure Stack or Azure.

Implementing DevOps in Microsoft Azure Stack

Watch on-demand | View slide deck

In this session, learn how to implement a DevOps pipeline for Azure Stack and integrate it with your Azure pipeline. This session also covers the benefits of leveraging infrastructure as code, as well as extending your pipeline to common open source tools.

Accelerate application development through OpenSource frameworks and marketplace items

Watch on-demand

Learn how to leverage popular open source workloads to accelerate your application development. Topics include Cloud Foundry, OpenShift, and Kubernetes, among others.

Azure Data Box Edge

Azure Data Box helps customers move large amounts of data to Microsoft Azure in a cost-effective way. Data Box offline devices easily move data to Azure when busy networks aren’t an option, and Data Box online appliances transfer data to and from Azure over the network.

Watch the informative Ignite 2018 sessions below and visit the Azure Data Box family website to quickly get up to speed on the solution, capabilities, and use cases. Also, watch this short profile of semiconductor innovator Cree to learn how they are using Azure Data Box Edge to scale innovation for its groundbreaking commercial silicon carbide materials.

Using Azure Data Box for moving large volumes of data into Azure

Watch on-demand | View slide deck

Learn how to use Data Box to quickly get your data to Azure, as well as announcements about the general availability and worldwide rollout. Plus, get insight on the new functionality coming soon.

The new additions to the Azure Data Box family

Watch on-demand | View slide deck

Hear how customers are using Data Box to quickly move data to Azure, as well as an overview of new additions to the Data Box family.

Delivering intelligent edge with Microsoft Azure Stack and Data Box

Watch on-demand | View slide deck

Join Technical Fellow Jeffrey Snover for a discussion about the future of applications purpose-built for the edge. This is where new solutions are needed to put intelligence into action by adapting to changes and continually incorporating new context to help people where they need it, when they need it. Systems designed and deployed across many different devices in the intelligent edge era will require a consistent infrastructure backplane so that they can act as one distributed, intelligent, and adaptive system. Hear about the direction that Microsoft is taking to help customers create adaptive systems that participate in a consistent platform across the edge and cloud.

Data Box Edge and Gateway - Bringing Azure Storage and Compute to the edge

Watch on-demand | View slide deck

Learn how to use two new Azure offerings—Data Box Gateway, an Azure storage gateway, and Data Box Edge, a cloud managed appliance combining the gateway with IoT edge computing—to move data in and out of Azure over your network. Process this data as part of an integrated workflow or for any edge compute tasks where you want easy use of Azure storage. This session covers how these pieces fit together, what problems they solve, and how you can join the preview.

Avere vFXT for Azure

Avere vFXT for Azure provides high-performance file access for high-performance computing (HPC) applications. The solution arms applications owners and computing directors tasked with managing critical HPC workloads with scalability, flexibility, and easy access to cloud and file-based storage locations. Watch the informative session below for an overview, use cases, and a demo. Also, be sure to visit our services page and read “Avere vFXT for Microsoft Azure now in public preview” to learn more.

Running high performance workloads with Avere vFXT for Azure

Watch on-demand | View slide deck

In this technical breakout session, discover how Avere vFXT for Azure enables cloud bursting and lift-and-shift migrations, including a live demonstration. Plus, learn how it works in specific industries, including media rendering, genomics sequencing, oil and gas, Monte Carlo simulations, and financial backtesting.

Announcing .NET Framework 4.8 Early Access build 3673


We are happy to share the next Early Access build for the .NET Framework 4.8. This includes an updated .NET 4.8 runtime as well as the .NET 4.8 Developer Pack (a single package that bundles the .NET Framework 4.8 runtime, the .NET 4.8 Targeting Pack and the .NET Framework 4.8 SDK).

Please help us ensure this is a high quality and compatible release by trying out this build and exploring the new features.

Next steps:
To explore the new features, download the .NET 4.8 Developer Pack build 3673. If you want to try just the .NET 4.8 runtime, you can download either of these:

Please provide your feedback via the .NET Framework Early Access GitHub repository.

Please note: This release is still under development and you can expect to see more features and fixes in future preview builds. Also, a reminder that this build is not supported for production use.

This preview build 3673 includes a key improvement/fix in the WPF area:

  • [WPF] – High DPI Enhancements

You can see the complete list of improvements in this build here.

.NET Framework build 3673 is also included in the next update for Windows 10. You can sign up for Windows Insiders to validate that your applications work great on the latest .NET Framework included in the latest Windows 10 releases.

WPF – High DPI Enhancements

WPF has added support for Per-Monitor V2 DPI Awareness and Mixed-Mode DPI scaling in .NET 4.8. Additional information about these Windows concepts is available here.

The latest Developer Guide for Per-Monitor application development in WPF states that only pure-WPF applications are expected to work seamlessly in a high-DPI WPF application and that hosted HWNDs and Windows Forms controls are not fully supported.

.NET 4.8 improves support for hosted HWNDs and Windows Forms interoperation in high-DPI WPF applications on platforms that support Mixed-Mode DPI scaling (Windows 10 v1803). When hosted HWNDs or Windows Forms controls are created as Mixed-Mode DPI-scaled windows (as described in the “Mixed-Mode DPI Scaling and DPI-aware APIs” documentation, by calling the SetThreadDpiHostingBehavior and SetThreadDpiAwarenessContext APIs), it will be possible to host such content in a Per-Monitor V2 WPF application and have it be sized and scaled appropriately. Such hosted content will not be rendered at the native DPI – instead, the OS will scale the hosted content to the appropriate size.

The support for Per-Monitor V2 DPI awareness mode also allows WPF controls to be hosted (i.e., parented) under a native window in a high-DPI application. Per-Monitor V2 DPI awareness support will be available on Windows 10 v1607 (Anniversary Update). Windows adds support for child HWNDs to receive DPI change notifications when Per-Monitor V2 DPI awareness mode is enabled via the application manifest.

This support is leveraged by WPF to ensure that controls hosted under a native window can respond to DPI changes and update themselves. For example, a WPF control hosted in a Windows Forms or Win32 application that is manifested as Per-Monitor V2 will now be able to respond correctly to DPI changes and update itself.

Note that Windows supports Mixed-Mode DPI scaling on Windows 10 v1803, whereas Per-Monitor V2 is supported on v1607 onwards.

To try out these features, the following application manifest and AppContext flags must be enabled:

1. Enable Per-Monitor DPI in your application

  • Turn on Per-Monitor V2 in your app.manifest

2. Turn on High DPI support in WPF

  • Target .NET Framework 4.6.2 or greater, and

3. Set the AppContext switch in your app.config

  • Set AppContextSwitch Switch.System.Windows.DoNotUsePresentationDpiCapabilityTier2OrGreater=false in App.Config to enable the Per-Monitor V2 and Mixed-Mode DPI support introduced in .NET 4.8.

The runtime section in the final App.Config might look like this:
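
A minimal sketch, assuming only this one switch is being set (the AppContextSwitchOverrides element is the standard mechanism for AppContext switches in application configuration files):

<configuration>
  <runtime>
    <!-- Opt in to the Per-Monitor V2 and Mixed-Mode DPI support added in .NET Framework 4.8 -->
    <AppContextSwitchOverrides value="Switch.System.Windows.DoNotUsePresentationDpiCapabilityTier2OrGreater=false" />
  </runtime>
</configuration>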

AppContext switches can also be set in the registry. You can refer to the AppContext class for additional documentation.

Closing
We will continue sharing early builds of the next release of the .NET Framework via the Early Access Program on a regular basis for your feedback. As a member of the .NET Framework Early Access community you play a key role in helping us build new and improved .NET Framework products. We will do our best to ensure these early access builds are stable and compatible, but you may see bugs or issues from time to time. It would help us greatly if you would take the time to report these to us on GitHub so we can address these issues before the official release.

Thank you!

Microsoft @ DevCon4


At DevCon in 2015, I announced support for Ethereum on Microsoft Azure. It was a humble beginning consisting of a private Ethereum network and a corresponding “Hello World” smart contract. Now in 2018, Azure supports all the major blockchain platforms and can deploy a wide variety of network configurations from a lab environment to more sophisticated consortium networks targeted at production workloads.

Since that time, we have engaged with hundreds of customers across every industry and every corner of the world. We have learned a lot about what business scenarios are viable and what it truly takes to deliver an end-to-end blockchain solution.

A key learning has been what it takes to build and establish blockchain networks. We strive to improve the overall lifecycle of these networks with better management capabilities, governance frameworks, and integrations with the cloud services needed to build dapps. One of the most impactful observations we have shared with the broader community is the requirement for untrusted networks to support robust applications with multiple participants, varying trust models, and privacy for participants and, potentially, outcomes.

The previous statement might seem simple at first, but it’s quite complex if you examine aspects like robustness, multiple participants, differing trust models, and privacy. Using blockchain technology alone will not address these requirements.

I have previously written extensively on the need for trusted “off-chain” compute in my prior posts about Cryptlets and Enterprise Smart Contracts (ESC). These posts highlighted the use of Trusted Execution Environments (TEEs), or enclaves, to enable multi-party cryptography as well as attested computation that can be implemented natively with blockchains. But we have also found that the use of enclaves can enhance, optimize, and add robustness to areas of a blockchain application stack.

For example, the Confidential Consortium Blockchain Framework uses enclaves to optimize the network or data layer. A blockchain node that uses the framework has a strong governance, privacy, and performance framework using enclaves at its base. Additionally, we’re exploring how enclaves can be used to optimize application logic for multi-participant contracts and applications.

As an active participant in the EEA Trusted Compute working group, which this week will release the V.5 of the Trusted Compute Specification, we will continue to explore, collaborate, and share techniques for advancing distributed multi-party applications and including them in our platform in an open way. Along those lines…

At DevCon4 we are pleased to release the Enclave-ready EVM (eEVM)

At DevCon4 in 2018, we are pleased to release an enclave-ready Ethereum Virtual Machine. Specifically, this release:

  • Is a C++ implementation of the EVM that has no operating system dependencies and can “run out-of-the-box" within a TEE/enclave.
  • Is “gas-less”, i.e., it does not compute gas during contract execution.
  • Supports the Homestead EVM opcodes.
  • Executes existing EVM bytecode.
  • Strictly decouples from storage and the replication engine/blockchain.
  • Could execute in an enclave on the node or off-chain.
  • Released under MIT License.

This EVM is NOT:

  • A new blockchain.
  • A supported product offering.
  • Intended to compete with any implementation.

This contribution demonstrates how TEEs like Intel® SGX technology can enhance the EVM with confidentiality. We expect that this codebase will be used as a starting point in projects across the ecosystem to have TEE-enabled EVM contract logic work on any blockchain or off-chain compute scenarios. We will continue to share and contribute with the EEA Trusted Compute workgroup to include the capabilities in the open specification and API.

We believe that the eEVM can be complementary and integrate with existing projects like Sawtooth and Burrow with cross blockchain support via ABCI, an abstraction interface that allows for an EVM to connect to any blockchain. The eEVM’s capabilities can also help shape EEA standards for Trusted Compute for maximum interoperability. Partners such as Intel, Truffle, Quorum, and blk.io have already expressed interest in utilizing this technology.

“The open source eEVM and Trusted Execution Environments like Intel(R) Software Guard Extensions can help improve blockchain scalability and privacy by enabling off-chain Ethereum smart contract execution. With the eEVM released to open source and the EEA’s announcement of the Trusted Compute API, off-chain smart contract capability has been extended to any blockchain developer.”

Mike Reed, Director of Intel's Blockchain Program Office

“With the eEVM, Microsoft is open-sourcing a core component of their already-impressive blockchain cloud infrastructure. This means that developers can take advantage of the wide array of existing Ethereum-based smart contracts and build richer applications that solve the issues enterprises face. At Truffle we're excited because the eEVM provides a strong base for future platforms and tools in the enterprise space -- tools we'd love to help develop. Microsoft's strong stance toward interoperability makes this an easy win, and helps make enterprise blockchain developers' lives easier in the process.”

Tim Coulter, CEO & Founder, Truffle Suite

 

“Microsoft are working at the leading edge of blockchain - directly contributing to the ecosystem with initiatives such as the eEVM, Enterprise Smart Contracts and contributions to the Enterprise Ethereum Alliance specifications. We’re proud to be partnering with them and excited about further opportunities for collaboration through web3j for Java developers and our Epirus platform.”

Conor Svensson, CEO & Founder blk.io and Technical Standards Chair, Enterprise Ethereum Alliance

We look forward to seeing the use of the eEVM evolve in the community. To learn more, review the EVM source code on eEVM.

Additional Ethereum community investments

Microsoft has been building with our partners and ISVs since the very early days of Ethereum. These integrations were initially focused on blockchain infrastructure and enabling the creation of private blockchains for labs, allowing developers and architects to focus on building applications. Our first partnership was with ConsenSys to create the Ethereum Azure ARM templates and the first blockchain extension for Visual Studio, which we released at our premier developer event, //build.

Since then we have expanded our partner integrations to allow simplified deployment of a wide variety of blockchains and expanded to embrace containerized workloads with Kubernetes integration to be the most open blockchain cloud platform.

Developer productivity and tooling continues to be a pain point in the industry, which we are addressing with partners like Truffle to bridge the backend blockchain directly into the developer experience and IDE, such as VS Code extensions. Truffle boxes can package common applications and samples to execute in Azure Blockchain Workbench. Coming later this week, the new integrated developer experience, which makes it dramatically easier for developers to get started building dapps, will be available in the marketplace.

Blockchain infrastructure templates also have increased in complexity over the years. The use of blockchains has evolved and now provides additional knobs and dials for customers to customize the ledger to fit their use cases targeted for production. Features like pluggable consensus allow customers to choose the right consensus algorithm for their network at deployment. Our latest Quorum offerings include options for both RAFT and iBFT models, while also offering users a choice of privacy model, using either Constellation (Haskell based), or Tessera (Java based).

Additionally, customers have asked to move from establishing networks, to inviting new participants to join existing multi-node networks. This is one of the most challenging aspects of creating a scaled consortium. Leveraging our hyperscale cloud components in Azure for network connectivity, our templates now allow customers to provision new network members selectively with a simple user experience, dramatically lowering the bar for creating real consortiums with multiple business partners.

This week we’re also excited to announce that Pantheon from PegaSys is also now available on Azure. Pantheon is a new type of offering that exposes a public node rather than the private nodes that enterprises have typically provisioned. Leading enterprises with private chains (closed consortiums) have realized that connectivity to multiple chains, including public ones, will be important. Pantheon provides a client that can enable this option for enterprises with a license that is more enterprise friendly. Built on Java, with seamless integration to web3j (the Java blockchain integration library), Pantheon makes it easier than ever for enterprise developers to embrace blockchain with their existing Java programming skills.

“Being available on Microsoft Azure from day one is key for PegaSys' goal to be as easy to deploy as possible. The Azure team has a long history of bringing blockchain to enterprise cloud developers and we're excited to work with Microsoft as they continue to add new tools for enterprises looking to deploy blockchains in production.”

Rob Dawson, Product Lead at PegaSys

Lastly, expanding beyond template and developer experience enhancements, customers have expressed a desire for private consortiums to have a matching decentralized storage companion. IPFS is a clear leader in this space for public Ethereum and has been working on private versions of IPFS for these customers.

We are happy to announce that later this week an IPFS solution will be available on the Azure Marketplace, allowing users to create a multi-node storage network in less than 5 minutes. This offering also allows other consortium participants to add more nodes to this private companion network, while respecting the consortium’s authorization components.

Looking forward to a great week in Prague, come see us at the Azure booth and check out @MSFTBlockchain on Twitter for the latest news from Azure Blockchain.

Side by Side user scoped .NET Core installations on Linux with dotnet-install.sh


I can run .NET Core on Windows, Mac, or a dozen Linuxes. On my Ubuntu installation I can check what version I have installed and where it is like this:

$ dotnet --version
2.1.403
$ which dotnet
/usr/bin/dotnet

If we interrogate that dotnet file we see it's a link to elsewhere:

$ ls -alogF /usr/bin/dotnet
lrwxrwxrwx 1 22 Sep 19 03:10 /usr/bin/dotnet -> ../share/dotnet/dotnet*

If we head over there we see similar stuff as we do on Windows.

Side by side DotNet installs

Basically c:\program files\dotnet is the same as /usr/share/dotnet.

$ cd ../share/dotnet
$ ll
total 136
drwxr-xr-x 1 root root   4096 Oct  5 19:47 ./
drwxr-xr-x 1 root root   4096 Aug  1 17:44 ../
drwxr-xr-x 1 root root   4096 Feb 13  2018 additionalDeps/
-rwxr-xr-x 1 root root 105704 Sep 19 03:10 dotnet*
drwxr-xr-x 1 root root   4096 Feb 13  2018 host/
-rw-r--r-- 1 root root   1083 Sep 19 03:10 LICENSE.txt
drwxr-xr-x 1 root root   4096 Oct  5 19:48 sdk/
drwxr-xr-x 1 root root   4096 Aug  1 18:07 shared/
drwxr-xr-x 1 root root   4096 Feb 13  2018 store/
-rw-r--r-- 1 root root  27700 Sep 19 03:10 ThirdPartyNotices.txt
$ ls sdk
2.1.4  2.1.403  NuGetFallbackFolder
$ ls shared
Microsoft.AspNetCore.All  Microsoft.AspNetCore.App  Microsoft.NETCore.App
$ ls shared/Microsoft.NETCore.App/
2.0.5  2.1.5

Looking in directories works to figure out what SDKs and runtime versions are installed, but the best way is to use the dotnet CLI itself, like this:

$ dotnet --list-sdks
2.1.4 [/usr/share/dotnet/sdk]
2.1.403 [/usr/share/dotnet/sdk]
$ dotnet --list-runtimes
Microsoft.AspNetCore.All 2.1.5 [/usr/share/dotnet/shared/Microsoft.AspNetCore.All]
Microsoft.AspNetCore.App 2.1.5 [/usr/share/dotnet/shared/Microsoft.AspNetCore.App]
Microsoft.NETCore.App 2.0.5 [/usr/share/dotnet/shared/Microsoft.NETCore.App]
Microsoft.NETCore.App 2.1.5 [/usr/share/dotnet/shared/Microsoft.NETCore.App]

There are great instructions on how to set up .NET Core on your Linux machines via Package Manager here.

Note that these installs of the .NET Core SDK are installed in /usr/share. I can use the dotnet-install.sh to do non-admin installs in my own user directory.

In order to gain more control and do things more manually, you can use this shell script here: https://dot.net/v1/dotnet-install.sh and its documentation is here at docs. For Windows there is also a PowerShell version https://dot.net/v1/dotnet-install.ps1

The main usefulness of these scripts is in automation scenarios and non-admin installations. There are two scripts: One is a PowerShell script that works on Windows. The other script is a bash script that works on Linux/macOS. Both scripts have the same behavior. The bash script also reads PowerShell switches, so you can use PowerShell switches with the script on Linux/macOS systems.

For example, I can see all the current .NET Core 2.1 versions at https://www.microsoft.com/net/download/dotnet-core/2.1 and 2.2 at https://www.microsoft.com/net/download/dotnet-core/2.2 - the URL format is regular. I can see from that page that at the time of this blog post, v2.1.5 is both Current (most recent stable) and also LTS (Long Term Support).

I'll grab the install script and chmod +x it. Running it with no options will get me the latest LTS release.

$ wget https://dot.net/v1/dotnet-install.sh
--2018-10-31 15:41:08--  https://dot.net/v1/dotnet-install.sh
Resolving dot.net (dot.net)... 104.214.64.238
Connecting to dot.net (dot.net)|104.214.64.238|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 30602 (30K) [application/x-sh]
Saving to: ‘dotnet-install.sh’

I like the "-DryRun" option because it will tell you what WILL happen without doing it.

$ ./dotnet-install.sh -DryRun
dotnet-install: Payload URL: https://dotnetcli.azureedge.net/dotnet/Sdk/2.1.403/dotnet-sdk-2.1.403-linux-x64.tar.gz
dotnet-install: Legacy payload URL: https://dotnetcli.azureedge.net/dotnet/Sdk/2.1.403/dotnet-dev-ubuntu.16.04-x64.2.1.403.tar.gz
dotnet-install: Repeatable invocation: ./dotnet-install.sh --version 2.1.403 --channel LTS --install-dir <auto>

If I use the dotnet-install script, I can have multiple copies of the .NET Core SDK installed in my user folder at ~/.dotnet. It all depends on your PATH. Note below that I use ~/.dotnet as my .NET Core install location and then run dotnet --list-sdks. Make sure you know what your PATH is and that you're getting the .NET Core you expect for your user.

$ which dotnet
/usr/bin/dotnet
$ export PATH=/home/scott/.dotnet:$PATH
$ which dotnet
/home/scott/.dotnet/dotnet
$ dotnet --list-sdks
2.1.402 [/home/scott/.dotnet/sdk]

Now I will add a few more .NET Core SDKs side-by-side with the dotnet-install.sh script. Remember again, these aren't .NETs installed with apt-get, which would be system level and run with sudo. These are user-profile installed versions.

There's really no reason to do side by side at THIS level of granularity, but it makes the point.
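
For example, invocations along these lines (using the script's --version and --install-dir switches) would lay down the extra SDKs into the user-profile location; the versions here match the listing that follows:

$ ./dotnet-install.sh --version 2.1.302 --install-dir ~/.dotnet
$ ./dotnet-install.sh --version 2.1.400 --install-dir ~/.dotnet
$ ./dotnet-install.sh --version 2.1.401 --install-dir ~/.dotnet
$ ./dotnet-install.sh --version 2.1.403 --install-dir ~/.dotnet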

$ dotnet --list-sdks
2.1.302 [/home/scott/.dotnet/sdk]
2.1.400 [/home/scott/.dotnet/sdk]
2.1.401 [/home/scott/.dotnet/sdk]
2.1.402 [/home/scott/.dotnet/sdk]
2.1.403 [/home/scott/.dotnet/sdk]

When you're doing your development, you can use "dotnet new globaljson" and have each path/project request a specific SDK version.

$ dotnet new globaljson
The template "global.json file" was created successfully.
$ cat global.json
{
  "sdk": {
    "version": "2.1.403"
  }
}

Hope this helps!


Sponsor: Reduce time to market and simplify IOT development using developer kits built on Intel Atom®, Intel® Core™ and Intel® Xeon® processors and tools such as Intel® System Studio and Arduino Create*



© 2018 Scott Hanselman. All rights reserved.
     