
Windows 10 SDK Preview Build 17677 available now!


Today, we released a new Windows 10 Preview Build of the SDK to be used in conjunction with Windows 10 Insider Preview (Build 17677 or greater). The Preview SDK Build 17677 contains bug fixes and in-development changes to the API surface area.

The Preview SDK can be downloaded from the developer section of the Windows Insider site.

For feedback and updates to the known issues, please see the developer forum. For new developer feature requests, head over to our Windows Platform UserVoice.

Things to note:

  • This build works in conjunction with previously released SDKs and Visual Studio 2017. You can install this SDK and still continue to submit apps that target the Windows 10 Creators Update (or earlier) to the Store.
  • The Windows SDK will now formally only be supported by Visual Studio 2017 and greater. You can download Visual Studio 2017 here.
  • This build of the Windows SDK will install on Windows 10 Insider Preview and supported Windows operating systems.

Known Issues

Installation on an operating system that is not a Windows 10 Insider Preview build is not supported and may fail.

Windows Device Portal

Please note that there is a known issue in this Windows Insider build that prevents the user from enabling Developer Mode through the For developers settings page.

Unfortunately, this means that you will not be able to remotely deploy a UWP application to your PC or use Windows Device Portal on this build. There are no known workarounds at the moment. Please skip this flight if you rely on these features.

Missing Contract File

The contract Windows.System.SystemManagementContract is not included in this release. To access the following APIs, use a previous Windows IoT extension SDK with your project. This bug will be fixed in a future preview build of the SDK.

The following APIs are affected by this bug:


namespace Windows.Services.Cortana {
  public sealed class CortanaSettings     
}
namespace Windows.System {
  public enum AutoUpdateTimeZoneStatus
  public static class DateTimeSettings
  public enum PowerState
  public static class ProcessLauncher
  public sealed class ProcessLauncherOptions
  public sealed class ProcessLauncherResult
  public enum ShutdownKind
  public static class ShutdownManager
  public struct SystemManagementContract
  public static class TimeZoneSettings
}

API Spotlight:

Check out LauncherOptions.GroupingPreference.


namespace Windows.System {
  public sealed class FolderLauncherOptions : ILauncherViewOptions {
    ViewGrouping GroupingPreference { get; set; }
  }
  public sealed class LauncherOptions : ILauncherViewOptions {
    ViewGrouping GroupingPreference { get; set; }
  }
}

This release contains the new LauncherOptions.GroupingPreference property to assist your app in tailoring its behavior for Sets. Watch the presentation here.

What’s New:

MC.EXE

We’ve made some important changes to the C/C++ ETW code generation of mc.exe (Message Compiler):

The “-mof” parameter is deprecated. This parameter instructs MC.exe to generate ETW code that is compatible with Windows XP and earlier. Support for the “-mof” parameter will be removed in a future version of mc.exe.

As long as the “-mof” parameter is not used, the generated C/C++ header is now compatible with both kernel-mode and user-mode, regardless of whether “-km” or “-um” was specified on the command line. The header will use the _ETW_KM_ macro to automatically determine whether it is being compiled for kernel-mode or user-mode and will call the appropriate ETW APIs for each mode.

  • The only remaining difference between “-km” and “-um” is that the EventWrite[EventName] macros generated with “-km” have an Activity ID parameter while the EventWrite[EventName] macros generated with “-um” do not have an Activity ID parameter.

The EventWrite[EventName] macros now default to calling EventWriteTransfer (user mode) or EtwWriteTransfer (kernel mode). Previously, the EventWrite[EventName] macros defaulted to calling EventWrite (user mode) or EtwWrite (kernel mode).

  • The generated header now supports several customization macros. For example, you can set the MCGEN_EVENTWRITETRANSFER macro if you need the generated macros to call something other than EventWriteTransfer.
  • The manifest supports new attributes.
    • Event “name”: non-localized event name.
    • Event “attributes”: additional key-value metadata for an event such as filename, line number, component name, function name.
    • Event “tags”: 28-bit value with user-defined semantics (per-event).
    • Field “tags”: 28-bit value with user-defined semantics (per-field – can be applied to “data” or “struct” elements).
  • You can now define “provider traits” in the manifest (e.g. provider group). If provider traits are used in the manifest, the EventRegister[ProviderName] macro will automatically register them.
  • MC will now report an error if a localized message file is missing a string. (Previously MC would silently generate a corrupt message resource.)
  • MC can now generate Unicode (utf-8 or utf-16) output with the “-cp utf-8” or “-cp utf-16” parameters.
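The kernel-mode/user-mode switch described above can be sketched as follows. This is an illustrative fragment, not the actual mc.exe-generated header: `_ETW_KM_` and `MCGEN_EVENTWRITETRANSFER` are the real macro names from this post, while the stub functions are hypothetical stand-ins for the ETW APIs so the pattern compiles anywhere.

```c
#include <stdio.h>

/* Hypothetical stand-ins for the real ETW APIs; a generated header would
   call EventWriteTransfer (user mode) or EtwWriteTransfer (kernel mode). */
static int stub_EventWriteTransfer(const char *name) {
    printf("user-mode write: %s\n", name);
    return 1;
}
static int stub_EtwWriteTransfer(const char *name) {
    printf("kernel-mode write: %s\n", name);
    return 2;
}

/* The generated header picks a default writer unless the caller has
   already defined the MCGEN_EVENTWRITETRANSFER customization macro. */
#ifndef MCGEN_EVENTWRITETRANSFER
  #ifdef _ETW_KM_
    #define MCGEN_EVENTWRITETRANSFER stub_EtwWriteTransfer
  #else
    #define MCGEN_EVENTWRITETRANSFER stub_EventWriteTransfer
  #endif
#endif

/* Each manifest event gets an EventWrite[EventName] macro that routes
   through whichever writer was selected. */
#define EventWriteMyEvent() MCGEN_EVENTWRITETRANSFER("MyEvent")
```

Because `_ETW_KM_` is not defined in a user-mode compile, `EventWriteMyEvent()` resolves to the user-mode stub here; defining `MCGEN_EVENTWRITETRANSFER` before including the header overrides the default entirely.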

API Updates and Additions

When targeting new APIs, consider writing your app to be adaptive in order to run correctly on the widest number of Windows 10 devices. Please see Dynamically detecting features with API contracts (10 by 10) for more information.

The following APIs have been added to the platform since the release of 17134.


namespace Windows.ApplicationModel {
  public sealed class AppInstallerFileInfo
  public sealed class LimitedAccessFeatureRequestResult
  public static class LimitedAccessFeatures
  public enum LimitedAccessFeatureStatus
  public sealed class Package {
    IAsyncOperation<PackageUpdateAvailabilityResult> CheckUpdateAvailabilityAsync();
    AppInstallerFileInfo GetAppInstallerFileInfo();
  }
  public enum PackageUpdateAvailability
  public sealed class PackageUpdateAvailabilityResult
}
namespace Windows.ApplicationModel.Calls {
  public sealed class VoipCallCoordinator {
    IAsyncOperation<VoipPhoneCallResourceReservationStatus> ReserveCallResourcesAsync();
  }
}
namespace Windows.ApplicationModel.Store.Preview {
  public static class StoreConfiguration {
    public static bool IsPinToDesktopSupported();
    public static bool IsPinToStartSupported();
    public static bool IsPinToTaskbarSupported();
    public static void PinToDesktop(string appPackageFamilyName);
    public static void PinToDesktopForUser(User user, string appPackageFamilyName);
  }
}
namespace Windows.ApplicationModel.Store.Preview.InstallControl {
  public enum AppInstallationToastNotificationMode
  public sealed class AppInstallItem {
    AppInstallationToastNotificationMode CompletedInstallToastNotificationMode { get; set; }
    AppInstallationToastNotificationMode InstallInProgressToastNotificationMode { get; set; }
    bool PinToDesktopAfterInstall { get; set; }
    bool PinToStartAfterInstall { get; set; }
    bool PinToTaskbarAfterInstall { get; set; }
  }
  public sealed class AppInstallManager {
    bool CanInstallForAllUsers { get; }
  }
  public sealed class AppInstallOptions {
    AppInstallationToastNotificationMode CompletedInstallToastNotificationMode { get; set; }
    bool InstallForAllUsers { get; set; }
    AppInstallationToastNotificationMode InstallInProgressToastNotificationMode { get; set; }
    bool PinToDesktopAfterInstall { get; set; }
    bool PinToStartAfterInstall { get; set; }
    bool PinToTaskbarAfterInstall { get; set; }
    bool StageButDoNotInstall { get; set; }
  }
  public sealed class AppUpdateOptions {
    bool AutomaticallyDownloadAndInstallUpdateIfFound { get; set; }
  }
}
namespace Windows.Devices.Enumeration {
  public sealed class DeviceInformation {
    string ContainerDeviceId { get; }
    DevicePhysicalInfo PhysicalInfo { get; }
  }
  public enum DeviceInformationKind {
    DevicePanel = 8,
  }
  public sealed class DeviceInformationPairing {
    public static bool TryRegisterForAllInboundPairingRequestsWithProtectionLevel(DevicePairingKinds pairingKindsSupported, DevicePairingProtectionLevel minProtectionLevel);
  }
  public sealed class DevicePhysicalInfo
  public enum PanelDeviceShape
}
namespace Windows.Devices.Enumeration.Pnp {
  public enum PnpObjectType {
    DevicePanel = 8,
  }
}
namespace Windows.Devices.Lights {
  public sealed class LampArray
  public enum LampArrayKind
  public sealed class LampInfo
  public enum LampPurposes : uint
}
namespace Windows.Devices.Lights.Effects {
  public interface ILampArrayEffect
  public sealed class LampArrayBitmapEffect : ILampArrayEffect
  public sealed class LampArrayBitmapRequestedEventArgs
  public sealed class LampArrayBlinkEffect : ILampArrayEffect
  public sealed class LampArrayColorRampEffect : ILampArrayEffect
  public sealed class LampArrayCustomEffect : ILampArrayEffect
  public enum LampArrayEffectCompletionBehavior
  public sealed class LampArrayEffectPlaylist : IIterable<ILampArrayEffect>, IVectorView<ILampArrayEffect>
  public enum LampArrayEffectStartMode
  public enum LampArrayRepetitionMode
  public sealed class LampArraySolidEffect : ILampArrayEffect
  public sealed class LampArrayUpdateRequestedEventArgs
}
namespace Windows.Devices.Sensors {
  public sealed class SimpleOrientationSensor {
    public static IAsyncOperation<SimpleOrientationSensor> FromIdAsync(string deviceId);
    public static string GetDeviceSelector();
  }
}
namespace Windows.Devices.SmartCards {
  public static class KnownSmartCardAppletIds
  public sealed class SmartCardAppletIdGroup {
    string Description { get; set; }
    IRandomAccessStreamReference Logo { get; set; }
    ValueSet Properties { get; }
    bool SecureUserAuthenticationRequired { get; set; }
  }
  public sealed class SmartCardAppletIdGroupRegistration {
    string SmartCardReaderId { get; }
    IAsyncAction SetPropertiesAsync(ValueSet props);
  }
}
namespace Windows.Devices.WiFi {
  public enum WiFiPhyKind {
    He = 10,
  }
}
namespace Windows.Graphics.Capture {
  public sealed class GraphicsCaptureItem {
    public static GraphicsCaptureItem CreateFromVisual(Visual visual);
  }
}
namespace Windows.Graphics.Imaging {
  public sealed class BitmapDecoder : IBitmapFrame, IBitmapFrameWithSoftwareBitmap {
    public static Guid HeifDecoderId { get; }
    public static Guid WebpDecoderId { get; }
  }
  public sealed class BitmapEncoder {
    public static Guid HeifEncoderId { get; }
  }
}
namespace Windows.Management {
  public static class MdmRegistrationManager
}
namespace Windows.Management.Deployment {
  public enum DeploymentOptions : uint {
    ForceUpdateFromAnyVersion = (uint)262144,
  }
  public sealed class PackageManager {
    IAsyncOperationWithProgress<DeploymentResult, DeploymentProgress> DeprovisionPackageForAllUsersAsync(string packageFamilyName);
  }
  public enum RemovalOptions : uint {
    RemoveForAllUsers = (uint)524288,
  }
}
namespace Windows.Management.Policies {
  public static class NamedPolicy {
    public static void SetPolicyAtPath(string accountId, string area, string name, NamedPolicyData2 value);
    public static void SetPolicyAtPathForUser(User user, string accountId, string area, string name, NamedPolicyData2 value);
  }
  public sealed class NamedPolicyData2
}
namespace Windows.Media.Core {
  public sealed class MediaStreamSample {
    IDirect3DSurface Direct3D11Surface { get; }
    public static MediaStreamSample CreateFromDirect3D11Surface(IDirect3DSurface surface, TimeSpan timestamp);
  }
}
namespace Windows.Media.Devices.Core {
  public sealed class CameraIntrinsics {
    public CameraIntrinsics(Vector2 focalLength, Vector2 principalPoint, Vector3 radialDistortion, Vector2 tangentialDistortion, uint imageWidth, uint imageHeight);
  }
}
namespace Windows.Media.MediaProperties {
  public sealed class ImageEncodingProperties : IMediaEncodingProperties {
    public static ImageEncodingProperties CreateHeif();
  }
  public static class MediaEncodingSubtypes {
    public static string Heif { get; }
  }
}
namespace Windows.Media.Streaming.Adaptive {
  public enum AdaptiveMediaSourceResourceType {
    MediaSegmentIndex = 5,
  }
}
namespace Windows.Security.Authentication.Web.Provider {
  public sealed class WebAccountProviderInvalidateCacheOperation : IWebAccountProviderBaseReportOperation, IWebAccountProviderOperation
  public enum WebAccountProviderOperationKind {
    InvalidateCache = 7,
  }
  public sealed class WebProviderTokenRequest {
    string Id { get; }
  }
}
namespace Windows.Security.DataProtection {
  public enum UserDataAvailability
  public sealed class UserDataAvailabilityStateChangedEventArgs
  public sealed class UserDataBufferUnprotectResult
  public enum UserDataBufferUnprotectStatus
  public sealed class UserDataProtectionManager
  public sealed class UserDataStorageItemProtectionInfo
  public enum UserDataStorageItemProtectionStatus
}
namespace Windows.Services.Cortana {
  public sealed class CortanaActionableInsights
  public sealed class CortanaActionableInsightsOptions
}
namespace Windows.Services.Store {
  public sealed class StoreContext {
    IAsyncOperation<StoreRateAndReviewResult> RequestRateAndReviewAppAsync();
    IAsyncOperation<IVectorView<StoreQueueItem>> SetInstallOrderForAssociatedStoreQueueItemsAsync(IIterable<StoreQueueItem> items);
  }
  public sealed class StoreQueueItem {
    IAsyncAction CancelInstallAsync();
    IAsyncAction PauseInstallAsync();
    IAsyncAction ResumeInstallAsync();
  }
  public sealed class StoreRateAndReviewResult
  public enum StoreRateAndReviewStatus
}
namespace Windows.Storage.Provider {
  public enum StorageProviderHydrationPolicyModifier : uint {
    AutoDehydrationAllowed = (uint)4,
  }
}
namespace Windows.System {
  public sealed class FolderLauncherOptions : ILauncherViewOptions {
    ViewGrouping GroupingPreference { get; set; }
  }
  public static class Launcher {
    public static IAsyncOperation<bool> LaunchFolderPathAsync(string path);
    public static IAsyncOperation<bool> LaunchFolderPathAsync(string path, FolderLauncherOptions options);
    public static IAsyncOperation<bool> LaunchFolderPathForUserAsync(User user, string path);
    public static IAsyncOperation<bool> LaunchFolderPathForUserAsync(User user, string path, FolderLauncherOptions options);
  }
  public sealed class LauncherOptions : ILauncherViewOptions {
    ViewGrouping GroupingPreference { get; set; }
  }
}
namespace Windows.System.UserProfile {
  public sealed class AssignedAccessSettings
}
namespace Windows.UI.Composition {
  public sealed class AnimatablePropertyInfo : CompositionObject
  public enum AnimationPropertyAccessMode
  public enum AnimationPropertyType
  public class CompositionAnimation : CompositionObject, ICompositionAnimationBase {
    void SetAnimatableReferenceParameter(string parameterName, IAnimatable source);
  }
  public enum CompositionBatchTypes : uint {
    AllAnimations = (uint)5,
    InfiniteAnimation = (uint)4,
  }
  public sealed class CompositionGeometricClip : CompositionClip
  public class CompositionObject : IAnimatable, IClosable {
    void GetPropertyInfo(string propertyName, AnimatablePropertyInfo propertyInfo);
  }
  public sealed class Compositor : IClosable {
    CompositionGeometricClip CreateGeometricClip();
  }
  public interface IAnimatable
}
namespace Windows.UI.Composition.Interactions {
  public sealed class InteractionTracker : CompositionObject {
    IReference<float> PositionDefaultAnimationDurationInSeconds { get; set; }
    IReference<float> ScaleDefaultAnimationDurationInSeconds { get; set; }
    int TryUpdatePositionWithDefaultAnimation(Vector3 value);
    int TryUpdateScaleWithDefaultAnimation(float value, Vector3 centerPoint);
  }
}
namespace Windows.UI.Notifications {
  public sealed class ScheduledToastNotification {
    public ScheduledToastNotification(DateTime deliveryTime);
    IAdaptiveCard AdaptiveCard { get; set; }
  }
  public sealed class ToastNotification {
    public ToastNotification();
    IAdaptiveCard AdaptiveCard { get; set; }
  }
}
namespace Windows.UI.Shell {
  public sealed class TaskbarManager {
    IAsyncOperation<bool> IsSecondaryTilePinnedAsync(string tileId);
    IAsyncOperation<bool> RequestPinSecondaryTileAsync(SecondaryTile secondaryTile);
    IAsyncOperation<bool> TryUnpinSecondaryTileAsync(string tileId);
  }
}
namespace Windows.UI.StartScreen {
  public sealed class StartScreenManager {
    IAsyncOperation<bool> ContainsSecondaryTileAsync(string tileId);
    IAsyncOperation<bool> TryRemoveSecondaryTileAsync(string tileId);
  }
}
namespace Windows.UI.ViewManagement {
  public sealed class ApplicationView {
    bool IsTabGroupingSupported { get; }
  }
  public sealed class ApplicationViewTitleBar {
    void SetActiveIconStreamAsync(RandomAccessStreamReference activeIcon);
  }
  public enum ApplicationViewWindowingMode {
    CompactOverlay = 3,
    Maximized = 4,
  }
  public enum ViewGrouping
  public sealed class ViewModePreferences {
    ViewGrouping GroupingPreference { get; set; }
  }
}
namespace Windows.UI.ViewManagement.Core {
  public sealed class CoreInputView {
    bool TryHide();
    bool TryShow();
    bool TryShow(CoreInputViewKind type);
  }
  public enum CoreInputViewKind
}
namespace Windows.UI.Xaml.Automation {
  public sealed class AutomationElementIdentifiers {
    public static AutomationProperty IsDialogProperty { get; }
  }
  public sealed class AutomationProperties {
    public static DependencyProperty IsDialogProperty { get; }
    public static bool GetIsDialog(DependencyObject element);
    public static void SetIsDialog(DependencyObject element, bool value);
  }
}
namespace Windows.UI.Xaml.Automation.Peers {
  public class AppBarButtonAutomationPeer : ButtonAutomationPeer, IExpandCollapseProvider {
    ExpandCollapseState ExpandCollapseState { get; }
    void Collapse();
    void Expand();
  }
  public class AutomationPeer : DependencyObject {
    bool IsDialog();
    virtual bool IsDialogCore();
  }
}
namespace Windows.UI.Xaml.Controls {
  public class AppBarElementContainer : ContentControl, ICommandBarElement, ICommandBarElement2
  public enum BackgroundSizing
  public sealed class Border : FrameworkElement {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
  }
  public class ContentPresenter : FrameworkElement {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
  }
  public class Control : FrameworkElement {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
    CornerRadius CornerRadius { get; set; }
    public static DependencyProperty CornerRadiusProperty { get; }
    bool UseSystemValidationVisuals { get; set; }
    public static DependencyProperty UseSystemValidationVisualsProperty { get; }
  }
  public class DropDownButton : Button
  public class Grid : Panel {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
  }
  public class NavigationView : ContentControl {
    bool IsTopNavigationForcedHidden { get; set; }
    NavigationViewOrientation Orientation { get; set; }
    UIElement TopNavigationContentOverlayArea { get; set; }
    UIElement TopNavigationLeftHeader { get; set; }
    UIElement TopNavigationMiddleHeader { get; set; }
    UIElement TopNavigationRightHeader { get; set; }
  }
  public enum NavigationViewOrientation
  public sealed class PasswordBox : Control {
    bool CanPasteClipboardContent { get; }
    public static DependencyProperty CanPasteClipboardContentProperty { get; }
    void PasteFromClipboard();
  }
  public class RelativePanel : Panel {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
  }
  public class RichEditBox : Control {
    RichEditTextDocument RichEditDocument { get; }
    FlyoutBase SelectionFlyout { get; set; }
    public static DependencyProperty SelectionFlyoutProperty { get; }
    event TypedEventHandler<RichEditBox, RichEditBoxSelectionChangingEventArgs> SelectionChanging;
  }
  public sealed class RichEditBoxSelectionChangingEventArgs
  public sealed class RichTextBlock : FrameworkElement {
    void CopySelectionToClipboard();
  }
  public class StackPanel : Panel, IInsertionPanel, IScrollSnapPointsInfo {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
  }
  public sealed class TextBlock : FrameworkElement {
    void CopySelectionToClipboard();
  }
  public class TextBox : Control {
    bool CanPasteClipboardContent { get; }
    public static DependencyProperty CanPasteClipboardContentProperty { get; }
    bool CanRedo { get; }
    public static DependencyProperty CanRedoProperty { get; }
    bool CanUndo { get; }
    public static DependencyProperty CanUndoProperty { get; }
    FlyoutBase SelectionFlyout { get; set; }
    public static DependencyProperty SelectionFlyoutProperty { get; }
    event TypedEventHandler<TextBox, TextBoxSelectionChangingEventArgs> SelectionChanging;
    void CopySelectionToClipboard();
    void CutSelectionToClipboard();
    void PasteFromClipboard();
    void Redo();
    void Undo();
  }
  public sealed class TextBoxSelectionChangingEventArgs
  public class TreeView : Control {
    bool CanDragItems { get; set; }
    public static DependencyProperty CanDragItemsProperty { get; }
    bool CanReorderItems { get; set; }
    public static DependencyProperty CanReorderItemsProperty { get; }
    Style ItemContainerStyle { get; set; }
    public static DependencyProperty ItemContainerStyleProperty { get; }
    StyleSelector ItemContainerStyleSelector { get; set; }
    public static DependencyProperty ItemContainerStyleSelectorProperty { get; }
    TransitionCollection ItemContainerTransitions { get; set; }
    public static DependencyProperty ItemContainerTransitionsProperty { get; }
    DataTemplate ItemTemplate { get; set; }
    public static DependencyProperty ItemTemplateProperty { get; }
    DataTemplateSelector ItemTemplateSelector { get; set; }
    public static DependencyProperty ItemTemplateSelectorProperty { get; }
    event TypedEventHandler<TreeView, TreeViewDragItemsCompletedEventArgs> DragItemsCompleted;
    event TypedEventHandler<TreeView, TreeViewDragItemsStartingEventArgs> DragItemsStarting;
  }
  public sealed class TreeViewDragItemsCompletedEventArgs
  public sealed class TreeViewDragItemsStartingEventArgs
  public sealed class WebView : FrameworkElement {
    event TypedEventHandler<WebView, WebViewWebResourceRequestedEventArgs> WebResourceRequested;
  }
  public sealed class WebViewWebResourceRequestedEventArgs
}
namespace Windows.UI.Xaml.Controls.Primitives {
  public class FlyoutBase : DependencyObject {
    bool IsOpen { get; }
    FlyoutShowMode ShowMode { get; set; }
    public static DependencyProperty ShowModeProperty { get; }
    public static DependencyProperty TargetProperty { get; }
    void Show(FlyoutShowOptions showOptions);
  }
  public enum FlyoutPlacementMode {
    BottomLeftJustified = 7,
    BottomRightJustified = 8,
    LeftBottomJustified = 10,
    LeftTopJustified = 9,
    RightBottomJustified = 12,
    RightTopJustified = 11,
    TopLeftJustified = 5,
    TopRightJustified = 6,
  }
  public enum FlyoutShowMode
  public sealed class FlyoutShowOptions : DependencyObject
}
namespace Windows.UI.Xaml.Hosting {
  public sealed class XamlBridge : IClosable
}
namespace Windows.UI.Xaml.Input {
  public sealed class FocusManager {
    public static event EventHandler<GettingFocusEventArgs> GettingFocus;
    public static event EventHandler<FocusManagerGotFocusEventArgs> GotFocus;
    public static event EventHandler<LosingFocusEventArgs> LosingFocus;
    public static event EventHandler<FocusManagerLostFocusEventArgs> LostFocus;
  }
  public sealed class FocusManagerGotFocusEventArgs
  public sealed class FocusManagerLostFocusEventArgs
  public sealed class GettingFocusEventArgs : RoutedEventArgs {
    Guid CorrelationId { get; }
  }
  public sealed class LosingFocusEventArgs : RoutedEventArgs {
    Guid CorrelationId { get; }
  }
}
namespace Windows.UI.Xaml.Markup {
  public sealed class FullXamlMetadataProviderAttribute : Attribute
  public interface IXamlBindScopeDiagnostics
}
namespace Windows.UI.Xaml.Media.Animation {
  public class BasicConnectedAnimationConfiguration : ConnectedAnimationConfiguration
  public sealed class BrushTransition : Transition
  public sealed class ConnectedAnimation {
    ConnectedAnimationConfiguration Configuration { get; set; }
  }
  public class ConnectedAnimationConfiguration
  public class DirectConnectedAnimationConfiguration : ConnectedAnimationConfiguration
  public class GravityConnectedAnimationConfiguration : ConnectedAnimationConfiguration
  public enum SlideNavigationTransitionEffect
  public sealed class SlideNavigationTransitionInfo : NavigationTransitionInfo {
    SlideNavigationTransitionEffect Effect { get; set; }
    public static DependencyProperty EffectProperty { get; }
  }
}
namespace Windows.Web.UI.Interop {
  public sealed class WebViewControl : IWebViewControl {
    event TypedEventHandler<WebViewControl, object> GotFocus;
    event TypedEventHandler<WebViewControl, object> LostFocus;
  }
  public sealed class WebViewControlProcess {
    string Partition { get; }
  }
  public sealed class WebViewControlProcessOptions {
    string Partition { get; set; }
  }
}

The post Windows 10 SDK Preview Build 17677 available now! appeared first on Windows Developer Blog.


Securing an Azure App Service Website under SSL in minutes with Let’s Encrypt


A screenshot that says "Your connection to this site is not secure."

Let's Encrypt is a free, automated, and open Certificate Authority. That means you can get free SSL certs and change your sites from http:// to https://. What's the catch? The SSL certificates only last 90 days, not a year or more. They do this to encourage automation, so if you set this up, you'll want a script or background process to automatically renew and install the certificates.
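The renewal logic itself is easy to reason about. Here's a minimal sketch, assuming the common convention of renewing 30 days before expiry (a convention of many ACME clients, not something Let's Encrypt mandates):

```python
from datetime import date, timedelta

def needs_renewal(issued: date, today: date, lifetime_days: int = 90,
                  renew_before_days: int = 30) -> bool:
    """Renew once we are within `renew_before_days` of expiry."""
    expiry = issued + timedelta(days=lifetime_days)
    return today >= expiry - timedelta(days=renew_before_days)

# A cert issued 65 days ago is inside the renewal window;
# one issued 10 days ago is not.
print(needs_renewal(date(2018, 3, 1), date(2018, 5, 5)))   # True
print(needs_renewal(date(2018, 5, 1), date(2018, 5, 11)))  # False
```

A background job (like the Web Job the extension below installs) just runs a check like this on a schedule and kicks off the ACME dance when it returns True.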

I run nearly two dozen websites (some small, some significant) on Azure. Given that Chrome 68+ is going to call out non-HTTPS sites explicitly as "Not secure" in July, now's as good a time as any for us to get our sites - large and small - encrypted. I have some small static "brochure-ware" sites like http://babysmash.com that just aren't worth the money for a cert. Now it's free, so let's do it.

In some theoretical future, I hope that Azure and clouds like it will have a single "encrypt it" button and handle the details for us, but as of the date of this blog post, there's some manual initial setup and some great work from the community.

There's a protocol for getting certificates called "ACME" - Automated Certificate Management Environment - and the EFF has a tool called Certbot that helps you request and deploy certs. There is a whole ecosystem around it, and if you are running Windows/IIS you can use a great simple ACME client called "Win-ACME." There are also UIs like Certify SSL Manager and PowerShell commands for ACME and systems like "GetSSL - Azure Automation," so you can feel free to roll your own script in an afternoon. Again, if you have a Windows VM and IIS, it's pretty straightforward and getting easier every day.

I'm currently using Simon J.K. Pedersen's lovely (and volunteer and unsupported, so be nice) Azure Let's Encrypt Web App Site Extension. I followed the instructions here but hit a few snags and a few things that aren't totally obvious. Many kudos and thanks to Simon for his hard work on this, as he's saving us all many hours of trouble!

Securing an Azure Web App with Let's Encrypt and the (unofficial) SJKP Let's Encrypt Site Extension

I'll go and secure BabySmash.com right now. Make a text file and keep track of these few things.

What's our checklist?

  • Azure Storage connection string - You'll need one for the extension to store state.
  • App Service Hosting Plan and App Service Resource Group Name - Ideally your "plan" (the VM your site runs on) and your site are in the same Resource Group (a resource group is just a name for a pile of stuff)
  • Service Principal Client/Application ID - This is like an account that the Site Extension will run as to do its job. It's an "on behalf of" delegate that will automate the changes to your site. You might see "client id" or "application id," they are the same thing.
  • Service Principal Client Secret - You'll make a new Key in your Service Principal. I called mine "login" but it doesn't matter, then some value like a generated password (also doesn't matter) and then hit Save. You'll then get a long hashed value - THAT is your Client Secret. Save it, you'll never see it again and you can't get it back.

Cool. Let's do it. Again, following along with the wiki, I'll make an App under Active Directory | App Registrations in the Azure Portal at https://portal.azure.com

Add a new App Registration in the Azure Portal

Make a new app...

Creating a new App Registration

Now grab the Application ID, aka Client ID and save that in your scratch space/notepad/sticky note/smart brain/don't lose it.

Copying the App Registration ClientID

Now click Settings, Keys, make a new one called "login" with a password and click Save. COPY THAT VALUE. You'll never see it again.

Adding a Key to the App Registration

Now, go to the Resource Group for your App Service and App Service Plan. Ideally it'll be the same one, but if it's not, go to each one and keep track of the names. I went there with the search box at the top of the Azure Portal.

Going to the Resource Group

The Portal changes sometimes, and this next step didn't line up to the Wiki instructions exactly. Click add, then make your new App Registration from above a "Contributor" to your Resource Group.

Adding the App as a Contributor to the Resource Group

Now head over to your actual App Service, and click Extensions.

App Service Extentions

I picked Azure Let's Encrypt to have this run as a Web Job in the background.

Adding the Let's Encrypt App Service Extension

Now, while you're at your Web App/Site, go to Settings and make sure you've set the following two connection strings, AzureWebJobsDashboard and AzureWebJobsStorage. Don't forget this step, or it'll all work once but fail in 3 months during the renewal.

Both of these should be set to your Azure Storage Account connection string, e.g. DefaultEndpointsProtocol=https;AccountName=[myaccount];AccountKey=[mykey];
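A connection string in this shape is just semicolon-delimited key=value pairs. As a sanity check, here's a minimal Python sketch (not an official Azure SDK helper) that splits one apart; note that base64 account keys can themselves end in '=', so each pair must be split only once:

```python
def parse_connection_string(cs: str) -> dict:
    """Split 'Key1=val1;Key2=val2;...' into a dict, splitting each
    pair on the first '=' only so base64 values survive intact."""
    return dict(part.split("=", 1) for part in cs.strip(";").split(";") if part)

cs = "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=abc123=="
settings = parse_connection_string(cs)
print(settings["AccountName"])  # myaccount
print(settings["AccountKey"])   # abc123==
```

Handy if you want a quick script that validates the two app settings are present and point at the same storage account before the 90-day renewal sneaks up on you.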

Remember the Web Job needs this storage so it can renew the certs every 3 months. Add them as "Custom."

Connection Strings in App Settings

Next, the instructions say to "configure the Site Extension." That can be confusing until you realize a Site Extension is really a "Side car web site." It is its own little website, running off to the side of your site. It will be at http://YOURSITENAME.scm.azurewebsites.net/LetsEncrypt so mine is at http://babysmash.scm.azurewebsites.net/LetsEncrypt.

You'll then want to fill this form out. Your "Tenant ID" is your Azure Active Directory URL. You'll find your SubscriptionId in the "Overview" tab.

Configuring the Let's Encrypt Extension

Click Next, then hold down CTRL (as this is a multi-selection dialog) and pick the sites you want a certificate for. Note that www.yourdomain and yourdomain (the naked domain) are two different certs.

Requesting two SSL Certs

You'll want to confirm you see "Certificate successfully installed."

Certificate successfully installed.

Then head back over to the Azure Portal and turn on HTTPS Only if you'd like Azure itself (versus your code) to redirect all non-secure requests to https://. Also confirm your SSL Bindings are correct. They should have been set up automatically.

HTTPS Only in the Azure Portal

Now I'll go hit https://babysmash.com and...

A screenshot that says "Your connection to this site is not secure."

It's not secure! Ah, now my site is in "mixed mode." That means that some of the resources like gifs or css were fetched with non-ssl (HTTP://) links. I'll update my site and all its external resources like YouTube embeds and fonts with https:// so that everything is secure. Since I'm using Git Deploy with Azure Web Apps (Azure App Service) I'll just make the changes and push the site again. You can also look at the elements as they load in F12 Browser Tools if you are having trouble finding out which image, css, or js file came in over http://

I'll redeploy and after a few tries, boom.

https://www.babysmash.com

And there's the cert. Note its expiration date. If the Site Extension does its job it will renew the cert before it expires!

A Let's Encrypt SSL Cert

Once I knew what I was doing, it took about 10 minutes per site. Thanks Simon for your work, and while there are multiple ways to do this, I found Simon's App Service Extension the easiest. I hope the Azure team comes up with a "One Click Solution" to this.

What do you think?


Sponsor: Learn how .NET in 2018 addresses the challenges developers are working on with future-focused technology. Get the new whitepaper on "The State of .NET in 2018" by the Progress Telerik team!



© 2018 Scott Hanselman. All rights reserved.
     

Gain application insights for Big Data solutions using Unravel data on Azure HDInsight


Unravel on HDInsight enables developers and IT Admins to manage performance, auto scaling & cost optimization better than ever.

We are pleased to announce Unravel on the Azure HDInsight Application Platform. Azure HDInsight is a fully-managed open-source big data analytics service for enterprises. You can use popular open-source frameworks (Hadoop, Spark, LLAP, Kafka, HBase, etc.) to cover a broad range of scenarios such as ETL, data warehousing, machine learning, IoT, and more. Unravel provides comprehensive application performance management (APM) for these scenarios and more. The application helps customers analyze, optimize, and troubleshoot application performance issues and meet SLAs in a seamless, easy-to-use, and frictionless manner. Some customers report up to 200 percent more jobs at 50 percent lower cost using Unravel’s tuning capability on HDInsight.

To learn more please join Pranav Rastogi, Program Manager at Microsoft Azure Big Data, and Shivnath Babu, CTO at Unravel, in a webinar on June 13 for how to build fast and reliable big data apps on Azure while keeping cloud expenses within your budget.

How complex is guaranteeing an SLA on a Big Data solution?

The inherent complexity of big data systems, disparate set of tools for monitoring, and lack of expertise in optimizing these open source frameworks create significant challenges for end-users who are responsible for guaranteeing SLAs. Users today have to monitor their applications with Ambari which only provides infrastructure metrics to administer the cluster health, performance and utilization. Big Data solutions use a variety of open source frameworks. Monitoring applications running across all of these frameworks is a daunting task. Users have to troubleshoot issues manually by analyzing logs from YARN, Hive, Tez, LLAP, Pig, Spark, Kafka, etc. To get good performance, users may have to change settings in Spark executors, YARN queues, Kafka topic configuration, region servers in HBase, storage throttling, sizing of compute and more. Unravelling this complexity is an art and science.

Monitoring Big Data applications now made easy with Unravel

Unravel on HDInsight makes intelligent application and operations management for Big Data a breeze. Its Application Performance Management correlates full-stack performance and provides automated insights and recommendations. Users can now analyze, troubleshoot, and optimize performance with ease. Here are the key value propositions of Unravel on HDInsight:

Proactive alerting and automatic actions

  • Proactive alerts on applications missing SLAs, violating usage policies or affecting other applications running on the cluster.
  • Automatic actions to resolve above issues using dynamic thresholds and company defined policies such as killing bad applications, re-directing apps based on priority levels.

Analyze app performance

  • Intuitive, end-to-end view of application performance with drill-down capabilities into bottlenecks, problem areas and errors.
  • Correlated insights into all factors affecting app performance such as resource allocation, container utilization, poor configuration settings, task execution pattern, data layout, resource contention and more.
  • Rapid detection of performance and cost issues caused by applications.

Following is an example of how Unravel diagnosed poor resource usage and high cost caused by a Hive on Tez application. Tuning the application using recommendations provided by Unravel reduced the cost of running this application by a factor of 10.

cost_wastage

Here is an example of cluster utilization after using Unravel. Unravel enables you to utilize resources efficiently and auto scale the cluster based on SLA needs, which results in cost savings.

auto_scaling

AI-driven intelligent recommendation engine

  • Recommend optimal values for fastest execution and/or least resource utilization including: data parallelism, optimal container size, number of tasks, etc.

The example below shows how Unravel’s AI-driven engine provides actionable recommendations for optimal performance of a Hive query.

tez_recommendations

  • Identify and automatically fix issues related to poor execution, skew, expensive joins, too many mappers/reducers, caching, etc.

Following is an example of Unravel automatically detecting lag in a real-time IoT application using Spark Streaming & Kafka and recommending a solution.

Streaming recommendations

Get started with Unravel on HDInsight

Customers can easily install Unravel on HDInsight using a single click in Azure portal. Unravel provides a live view into the behavior of big data applications using open source frameworks such as Hadoop, Hive, Spark, Kafka, LLAP and more on Azure HDInsight.

Installing unravel

After installing you can launch the Unravel application from the Applications blade of the cluster as shown below.

Launching unravel data

Try Unravel now on HDInsight!

Attend the webinar!

To learn more please join Pranav Rastogi, Program Manager at Microsoft Azure Big Data, and Shivnath Babu, CTO at Unravel, in a webinar on June 13 on how to build fast and reliable big data apps on Azure while keeping cloud expenses within your budget.

Summary

Azure HDInsight is the most comprehensive platform offering a wide range of fully-managed frameworks such as Hadoop, Spark, Hive, LLAP, Kafka, HBase, Storm, and more. We are pleased to announce the expansion of HDInsight Application Platform to include Unravel. Unravel provides comprehensive Application Performance Management (APM) across various open source analytical frameworks to help customers analyze, optimize, and troubleshoot application performance issues and meet SLAs in a seamless, easy to use, and frictionless manner.

Stay up-to-date on the latest Azure HDInsight news and features by following us on Twitter #HDInsight and @AzureHDInsight. For questions and feedback, please reach out to AskHDInsight@microsoft.com.

Getting started with Apache Spark on Azure Databricks


Data is growing at an astounding rate, with an estimated 2.5 quintillion bytes being created every day. Data analysts predict that by 2020, the world’s collected data will quadruple. In the sea of all this data, we are continually exploring new ways of analyzing and interpreting data in a way that’s productive, meaningful and insightful.

Designed in collaboration with the original founders of Apache® Spark™, Azure Databricks combines the best of Databricks and Microsoft Azure to help customers accelerate innovation with streamlined workflows, an interactive workspace and one-click set up. Azure Databricks is an analytics engine built for large scale data processing that enables collaboration between data scientists, data engineers and business analysts.

Azure Databricks can be used to run workloads faster and write applications in the language of your choice, whether that’s Scala, SQL, R or Python. When in sync with Azure Databricks, businesses can innovate within the safe, protected cloud environment of Microsoft Azure and benefit from the native integration with other Azure services such as Power BI, Azure SQL Data Warehouse, and Azure Cosmos DB.

When you’re getting started with Apache Spark on Azure Databricks, you’ll have questions that are unique to your businesses implementation and use case. In this introductory webinar provided by Microsoft, we’ll answer your questions in real-time and cover the following common use cases:

  • RDD’s, Data Frames, Data Sets and other fundamentals of Apache Spark
  • How to set up Azure Databricks
  • How to use Azure Databricks interactive notebooks, providing you a collaborative workspace for your analytics team
  • How to put your work into immediate production by scheduling notebooks

Register for this webinar to get answers to your questions and learn how to get started with Apache Spark on Azure Databricks.

Digging in with Azure IoT: Our interactive developer guide


The Internet of Things (IoT) presents many compelling opportunities for developers. With the right tools and guidance, it doesn’t need to be intimidating. The conceptual and technical differences between IoT and traditional web and application development are relatively easy to grasp. Plus, Microsoft offers a full range of managed services to make it even easier, while enabling you to experiment and scale. If your main skillset is in the Microsoft ecosystem, you can jump right in with Visual Studio Code, .NET, and Azure. If you prefer other languages and tools, our open-source SDKs will enable you to work with Azure solutions while staying in your comfort zone. Either way, you can get started quickly with our Azure IoT developer guide that will take you through common application patterns and tutorials that will have you up and running fast.

The IoT Application pattern

When it comes to knowing where to start, it can help to have a conceptual framework that organizes services and technologies in a logical way. We tend to think of IoT architecture as being divided into three main categories, each of which has considerations distinct from traditional development architectures:

  • Things: The devices and sensors in the field that are collecting data, sending and receiving messages, and in many cases performing analytics and initiating actions. IoT solutions typically provide a means to provision, connect, secure, and manage things that can be difficult to access or have very low processing power.
  • Insights: The useful information that can be gleaned from things, which can roughly be divided into device data (about the state of the device itself) and telemetry data (the measurements collected by the device, such as movement, temperature, location, and so on). The challenge with IoT insights is figuring out which data is useful and processing it at massive scales in real time.
  • Actions: This is where you get actual value from IoT things and insights. You can think of actions as being divided into a “hot path” from real-time data and a “cold path” where data is stored for later analysis. Building a solution that enables you to deliver insights to the right endpoints and incorporate new ones as time goes on is critical to long-term value.

Starting from a simple framework like this can help you conceptualize what you need to get to a working solution. For example, when it comes to “things,” do you have devices in place already or are you going to buy or build them? How will you manage insights? That is, will your devices always be connected to the cloud, or do you need to support intermittent connectivity? Do you need real-time processing, or is it okay to send information to a data lake? And when planning actions, what are you hoping to achieve from a business standpoint? Do you need to show results in a dashboard, apply machine learning, trigger alerts, or something else?

The best way to get a sense of what you need is to get your hands dirty and build something. This can seem daunting from the outset, given that people routinely talk about millions of devices streaming terabytes of data. Luckily, with cloud-based managed services such as those offered by Azure IoT, you can start playing with IoT services quickly, easily, and without a large upfront capital investment. You can even try for free with an Azure free trial. Our IoT developer’s guide is a great place to start. It takes you through things, insights, and actions, mapping Azure IoT services to each level of IoT, and then provides a step-by-step plan for experimenting with live IoT capabilities using languages and tools of your choice. You might find that creating IoT solutions is not only pretty straightforward, but also fun.

Soft delete for Azure Storage Blobs generally available


Today we are excited to announce general availability of soft delete for Azure Storage Blobs! The feature is available in all regions, both public and private.

When turned on, soft delete enables you to save and recover your data when blobs or blob snapshots are deleted. This protection extends to blob data that is erased as the result of an overwrite.

How does it work?

When data is deleted, it transitions to a soft deleted state instead of being permanently erased. When soft delete is on and you overwrite data, a soft deleted snapshot is generated to save the state of the overwritten data. Soft deleted objects are invisible unless explicitly listed. You can configure the amount of time soft deleted data is recoverable before it is permanently expired.

storage-blob-soft-delete-overwrite-then-delete (2)

Soft deleted data is grey, while active data is blue. More recently written data appears beneath older data. When B0 is overwritten with B1, a soft deleted snapshot of B0 is generated.​ When the blob is deleted, the root (B1) also moves into a soft deleted state.
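For illustration, here is a minimal sketch of enabling soft delete and recovering a blob with the .NET client library mentioned below (version 9.x). The retention period, container name, and blob name are assumptions; the snippet runs inside an async method with a valid storage connection string:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

var account = CloudStorageAccount.Parse(connectionString);
var client = account.CreateCloudBlobClient();

// Turn on soft delete at the blob service level with a 7-day retention window.
ServiceProperties properties = await client.GetServicePropertiesAsync();
properties.DeleteRetentionPolicy = new DeleteRetentionPolicy
{
    Enabled = true,
    RetentionDays = 7 // soft deleted data is recoverable for 7 days
};
await client.SetServicePropertiesAsync(properties);

// Later: recover a soft deleted blob before its retention period expires.
CloudBlockBlob blob = client.GetContainerReference("mycontainer")
                            .GetBlockBlobReference("myblob.txt");
await blob.UndeleteAsync();
```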

Soft delete is 100 percent backwards compatible; you don’t have to make changes to your applications to take advantage of the protections this feature affords. With this GA announcement, we have added support for tiering blobs with soft deleted snapshots. When Set Blob Tier is called on a blob with soft deleted snapshots, the snapshots will remain in the original storage tier and expire based on the retention period you configured. The base blob will move to the new tier.

When you create a new account, soft delete is off by default. Soft delete is also off by default for existing storage accounts. You can toggle the feature on and off at any time during the life of a storage account. Object-level soft delete is available for all storage account types and all storage tiers. It does not protect against container or account deletions. To learn how to protect a storage account from accidental deletes, please see the Azure Resource Manager article Lock Resources to Prevent Unexpected Changes.

Soft deleted data is billed at the same rate as active data. For more details on prices for Azure Blob Storage in general, check out the Azure Blob Storage Pricing Page.

Getting started

Soft delete is supported by Azure Portal, .NET Client Library (version 9.0.0), Java Client Library (version 7.0.0), Python Client Library (version 1.1.0), Node.js Client Library (version 2.8.0), PowerShell (version 5.3.0) and CLI 2.0 (version 2.0.27). You can also directly use the Storage Services REST API as always. Soft delete is supported by REST API version 2017-07-29 and greater. In general, we always recommend using the latest version regardless of whether you are using this feature.

storage-blob-soft-delete-portal-configuration

To enable soft delete using the Azure Portal, navigate to the "Soft delete" option under "Blob Service." Then, click "Enabled" and enter the number of days you want to retain soft deleted data.

For more details on the feature see the soft delete documentation as well as this soft delete code sample.

Soft delete helps ensure that you can recover accidentally deleted or modified blob data. Soft delete is a key part of an overall data protection strategy that includes Azure Resource Manager locks as well as the ZRS, GRS, and RA-GRS replication tiers. We look forward to hearing feedback about this feature on this post or at Azure Feedback.

Survey of business leaders highlights the importance of digitally enabling Firstline Workers


Today, there are more than two billion Firstline Workers, who are often a customer’s first point of contact, making them the first to represent a company’s brand. Microsoft partnered with Harvard Business Review Analytic Services to survey the roles and importance of the Firstline Workforce within the context of digital transformation.

Of the 383 business leaders who responded to the survey:

  • 78 percent responded that connecting and empowering Firstline Workers is critical.
  • 67 percent think increased efficiency is their top driver.
  • 60 percent will adopt advanced analytics in the next two years.

Read the full analysis of the survey, Building for success at the Firstline of business, to learn more.

The post Survey of business leaders highlights the importance of digitally enabling Firstline Workers appeared first on Microsoft 365 Blog.

New Updates for Azure Development in Visual Studio


With Microsoft Build 2018 in the books, we just released a series of updates to our Visual Studio Tools for Azure Development. These updates improve productivity and security for developers building and diagnosing web applications, containers, and microservices. To help you make the most of these new advancements for developing in the Azure cloud, we’ve compiled a “best of” list of what was announced at this year’s event.

Web App Development

We improved Web publish by adding App Service Linux as a target. This allows you to build a .NET Core app and easily publish it to App Service running on native Linux, unlike today where you must build custom containers to deploy on Linux.

As a part of this publish flow you will see that Web Apps and Functions allow you to create storage accounts as a part of the experience. This is helpful when your Web apps or Functions leverage storage accounts as a way to trigger events based off of actions in Storage, or if you’re interested in storing information in the cloud rather than within your application.

For those developing cross-platform applications, we’ve also made improvements to the publish flow of ASP.NET Core applications. You can now configure your ASP.NET Core application to publish as a standalone app, and change from Release to Debug prior to publishing.

To help you diagnose and address issues with your code, we added new features for the Application Insights Profiler and Snapshot debugger. It is now easier to enable these features without requiring a redeploy, run the profiler on-demand, and get a snapshot health check to identify reasons for missing a snapshot. You also have more entry points for accessing the debugger from within Visual Studio. For example, you can launch the Snapshot Debugger inside Visual Studio from the Debug -> Attach Snapshot Debugger menu.

Connecting your code with the cloud

When you’re first starting with your application, it can be hard to figure out all the ways you can connect Azure services with your solution. Connected Services is a Visual Studio experience that allows you to easily integrate your code with Azure services, even provisioning them if necessary. This experience has added support for three more Azure offerings in addition to existing ones:

Containers & Microservices

Support for containers keeps growing with new and updated features for building cloud-native applications with microservices and containers. The “Add Docker Support” functionality has been simplified starting with Visual Studio 15.8 preview 1. Now, when you add Docker support to an ASP.NET Core web application, Visual Studio will create a Dockerfile for you, without a separate Docker Compose project. You can still debug your application inside a Docker container or publish to Azure App Service for Containers or Azure Container Registry, just as before.

If you wish to add Docker Compose support to your project, you can use the new “Add Container Orchestration” option on the project’s context menu. This will allow you to choose between different Orchestration options, such as Service Fabric. We’re also excited to announce the private preview of tools for publishing your application to Kubernetes. Once you have the private preview tooling installed, the “Add Container Orchestration” option will also allow you to add the appropriate files for running your application in Kubernetes, including a Dockerfile as well as Helm charts. In addition, you’ll see a new option for publishing directly to Azure Kubernetes Service. If you’re interested in trying out the private preview, sign up for the private preview here.

Whatever option you choose for your orchestrator, you also get additional features if you’re storing your base images in Azure Container Registries, as support for geo-replication is now generally available. This feature helps safeguard your images by making sure they’re cloned to other data centers, so you don’t have to worry about the images becoming unavailable.

If you’re interested in building microservices with the reliability, mission-critical performance, and scale of Service Fabric, but without cluster management and patching operations, you now have the option of using Azure Service Fabric Mesh, which was announced in private preview, with a public preview becoming available in the coming weeks.

Storage Explorer

Although not part of Visual Studio, Storage Explorer is an integral tool for examining the contents of your Storage Accounts and other storage resources. Storage Explorer is now Generally Available, with the latest update including an improved sign-in experience and increased accessibility support.

If you’re interested in getting base applications to test out any of these updates, you can download a sample from our revamped Azure samples page.

Lastly, moving applications to the cloud in order to take advantage of all the benefits of Azure services is a major undertaking for some companies, but it can be hard to navigate this process. You can learn more about it by listening to our guest spot on this podcast about lifting and shifting applications to the cloud.

We hope you enjoy these developer features! We look forward to seeing your feedback on them. In the meantime, you can find more information in the Azure Developer Center.

Thanks,
-The Azure Developer Experiences team

Cristy Gonzalez, Program Manager, Azure Developer Experiences

Cristy has been a PM at Microsoft since 2013. In that time, she has worked on the Azure SDK, Storage Explorer, Azure Diagnostics, and Azure developer services like DevTest Labs and ACR. She’s currently working on experiences for users getting started with Azure.


ASP.NET Core 2.1.0 now available


Today we're thrilled to announce the release of ASP.NET Core 2.1.0! This is the latest release of our open-source and cross-platform web framework for .NET and it's now ready for production use. Get started with ASP.NET Core 2.1 today!

New features in this release include:

Check out What's New in ASP.NET Core 2.1 in the ASP.NET Core docs to learn more about these features. For a complete list of all the changes in this release, see the release notes.

ASP.NET Core 2.1.0 is available with .NET Core 2.1.0 along with Entity Framework Core 2.1.0, which you can read about in the corresponding blog posts.

Get started

You can Get started with ASP.NET Core 2.1 in 10 minutes by installing the latest .NET Core SDK and latest Visual Studio release. Then follow the tutorial instructions to create your first ASP.NET Core app.

Migrating an ASP.NET Core 2.0.x project to 2.1.0

For instructions on migrating to ASP.NET Core 2.1 see Migrating from ASP.NET Core 2.0.x project to 2.1.0 in ASP.NET Core documentation.

Deploy to Azure

ASP.NET Core 2.1 is available to be used on Azure App Service today. Publish your ASP.NET Core 2.1 app to Azure App Service.

For apps that use ASP.NET Core SignalR, configure your app to use the new Azure SignalR Service (public preview) to scale the real-time capabilities of your app.

Give feedback

We hope you enjoy using the new features and improvements in ASP.NET Core 2.1.0. If you have any questions or find any issues with this release let us know by filing issues on GitHub.

Thanks for using ASP.NET Core!

Announcing Entity Framework Core 2.1


Today we are excited to announce the release of Entity Framework (EF) Core 2.1. This is the latest production-ready release of our open-source and cross-platform data access technology. We are releasing today alongside .NET Core 2.1 and ASP.NET Core 2.1. EF Core 2.1 targets .NET Standard 2.0 and so also runs on .NET Core 2.0 and .NET Framework 4.6.1 or later.

New features

More than 500 bug fixes and enhancements have been made since EF Core 2.0 was released. Because EF Core is developed as an open-source project on GitHub, a complete list of these changes can be found in the GitHub issue tracker:

The following sections highlight some of the most significant new features.

Lazy loading

EF Core now contains the necessary building blocks for anyone to author entity classes that can load their navigation properties on demand. We have also created a new package, Microsoft.EntityFrameworkCore.Proxies, that leverages those building blocks to produce lazy loading proxy classes based on minimally modified entity classes (e.g. classes with virtual navigation properties).

Read the documentation on lazy loading for more information about this topic.
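As a hedged sketch of how the proxies package is wired up (entity names and connection string are illustrative): navigation properties are declared virtual so the generated proxies can override them, and the proxies are enabled via UseLazyLoadingProxies:

```csharp
public class Blog
{
    public int Id { get; set; }
    // Must be virtual so the lazy loading proxy can override it.
    public virtual ICollection<Post> Posts { get; set; }
}

public class Post
{
    public int Id { get; set; }
    public virtual Blog Blog { get; set; }
}

public class BloggingContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options
            .UseLazyLoadingProxies() // from Microsoft.EntityFrameworkCore.Proxies
            .UseSqlServer(connectionString);
}
```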

Parameters in entity constructors

As one of the required building blocks for lazy loading, we enabled the creation of entities that take parameters in their constructors. You can use parameters to inject property values, lazy loading delegates, and services.

Read the documentation on entity constructors with parameters for more information about this topic.
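For example (a minimal sketch; the entity is illustrative), EF Core 2.1 can bind a constructor parameter to a property by name, allowing read-only properties that are only set at construction time:

```csharp
public class Blog
{
    // EF Core matches the "url" parameter to the Url property by name
    // when materializing instances from query results.
    public Blog(string url)
    {
        Url = url;
    }

    public int Id { get; set; }
    public string Url { get; }
}
```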

Value conversions

Until now, EF Core could only map properties of types natively supported by the underlying database provider. Values were copied back and forth between columns and properties without any transformation. Starting with EF Core 2.1, value conversions can be applied to transform the values obtained from columns before they are applied to properties, and vice versa. We have a number of conversions that can be applied by convention as necessary, as well as an explicit configuration API that allows registering custom conversions between columns and properties. Some of the applications of this feature are:

  • Storing enums as strings
  • Mapping unsigned integers with SQL Server
  • Automatic encryption and decryption of property values

There are currently some known limitations with value conversions:

  • Support is currently very limited for store-generated values (such as Identity columns) when a value converter is configured on the property. See issue #11597
  • Some query translations may not be performed correctly if a normally translated type (such as DateTime) is converted to a different type in the database. In such cases it may be necessary to force in-memory evaluation of the query. See issue #10265.

Read the documentation on value conversions for more information about this topic.
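As a sketch of both the built-in and explicit configuration APIs (the Order entity and OrderStatus enum are assumptions), storing an enum as a string can be configured in OnModelCreating:

```csharp
public enum OrderStatus { Pending, Shipped }

// Built-in conversion: persist the enum as its string name
// instead of its underlying numeric value.
modelBuilder.Entity<Order>()
    .Property(o => o.Status)
    .HasConversion<string>();

// Or register a custom conversion explicitly with a pair of lambdas:
// one to write to the database, one to read back.
modelBuilder.Entity<Order>()
    .Property(o => o.Status)
    .HasConversion(
        v => v.ToString(),
        v => (OrderStatus)Enum.Parse(typeof(OrderStatus), v));
```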

LINQ GroupBy translation

Before version 2.1, in EF Core the GroupBy LINQ operator was always evaluated in memory. We now support translating it to the SQL GROUP BY clause in most common cases.

This example shows a query with GroupBy used to compute various aggregate functions:

var query = context.Orders
    .GroupBy(o => new { o.CustomerId, o.EmployeeId })
    .Select(g => new
    {
        g.Key.CustomerId,
        g.Key.EmployeeId,
        Sum = g.Sum(o => o.Amount),
        Min = g.Min(o => o.Amount),
        Max = g.Max(o => o.Amount),
        Avg = g.Average(o => o.Amount)
    });

The corresponding SQL translation looks like this:

SELECT [o].[CustomerId], [o].[EmployeeId],
    SUM([o].[Amount]), MIN([o].[Amount]), MAX([o].[Amount]), AVG([o].[Amount])
FROM [Orders] AS [o]
GROUP BY [o].[CustomerId], [o].[EmployeeId];

Note that some queries that compose GroupBy with other LINQ operators and query shapes may still be evaluated in memory. Also, there are some edge cases that now have issues where GroupBy was previously evaluated in memory. See the GitHub issue tracker for currently open issues involving GroupBy.

Data Seeding

With the new release it is possible to provide initial data to populate a database. Unlike in EF6, seed data is associated with an entity type as part of the model configuration. Then EF Core migrations can automatically compute what insert, update, or delete operations need to be applied when upgrading the database to a new version of the model.

As an example, you can use this to configure seed data for a Post in OnModelCreating:

modelBuilder.Entity<Post>().HasData(new Post{ Id = 1, Text = "Hello World!" });

Read the documentation on data seeding for more information about this topic.

Query types

An EF Core model can now include query types. Unlike entity types, query types do not have keys defined on them and cannot be inserted, deleted or updated (i.e. they are read-only), but they can be returned directly by queries. Some of the usage scenarios for query types are:

  • Mapping to views without primary keys
  • Mapping to tables without primary keys
  • Mapping to queries defined in the model
  • Serving as the return type for FromSql() queries

Read the documentation on query types for more information about this topic.
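As an illustrative sketch (the view name and shape are assumptions), a query type is a keyless class registered with Query&lt;T&gt; and mapped to a view, then exposed via a DbQuery&lt;T&gt; set:

```csharp
// A keyless, read-only type: no key is defined and it cannot be
// inserted, updated, or deleted.
public class CustomerOrderCount
{
    public string CustomerName { get; set; }
    public int OrderCount { get; set; }
}

public class OrdersContext : DbContext
{
    public DbQuery<CustomerOrderCount> CustomerOrderCounts { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Map the query type to an existing database view.
        modelBuilder.Query<CustomerOrderCount>()
                    .ToView("View_CustomerOrderCounts");
    }
}

// Query types are returned directly by queries:
// var counts = context.CustomerOrderCounts.ToList();
```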

Include for derived types

It is now possible to specify navigation properties defined only on derived types when writing expressions for the Include method. For the strongly typed version of Include, we support using either an explicit cast or the as operator. We also now support referencing the names of navigation properties defined on derived types in the string version of Include:

var option1 = context.People.Include(p => ((Student)p).School);
var option2 = context.People.Include(p => (p as Student).School);
var option3 = context.People.Include("School");

System.Transactions support

We have added the ability to work with System.Transactions features such as TransactionScope. This will work on both .NET Framework and .NET Core when using database providers that support it.

Read the documentation on System.Transactions for more information about this topic.
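For example (a minimal sketch; the context and entity are illustrative), a TransactionScope can wrap a SaveChanges call so the work commits or rolls back as a unit:

```csharp
using System.Transactions;

using (var scope = new TransactionScope(
    TransactionScopeOption.Required,
    new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted }))
{
    using (var context = new OrdersContext())
    {
        context.Orders.Add(new Order { Amount = 150 });
        context.SaveChanges();
    }

    // Commit only if everything inside the scope succeeded;
    // disposing without Complete() rolls the transaction back.
    scope.Complete();
}
```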

Better column ordering in initial migration

Based on customer feedback, we have updated migrations to initially generate columns for tables in the same order as properties are declared in classes. Note that EF Core cannot change the column order when new members are added after the initial table creation.

Optimization of correlated sub-queries

We have improved our query translation to avoid executing “N + 1” SQL queries in many common scenarios in which the use of a navigation property in the projection leads to joining data from the root query with data from a correlated sub-query. The optimization requires buffering the results from the sub-query, and we require that you modify the query to opt in to the new behavior.

As an example, the following query normally gets translated into one query for Customers, plus N (where “N” is the number of customers returned) separate queries for Orders:

var query = context.Customers.Select(
    c => c.Orders.Where(o => o.Amount > 100).Select(o => o.Amount));

By including ToList() in the right place, you indicate that buffering is appropriate for the Orders, which enables the optimization:

var query = context.Customers.Select(
    c => c.Orders.Where(o => o.Amount > 100).Select(o => o.Amount).ToList());

Note that this query will be translated to only two SQL queries: one for Customers and one for Orders.

OwnedAttribute

It is now possible to configure owned entity types by simply annotating the type with [Owned] and then making sure the owner entity is added to the model:

[Owned]
public class StreetAddress
{
    public string Street { get; set; }
    public string City { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public StreetAddress ShippingAddress { get; set; }
}

The new attribute is defined in a new package: Microsoft.EntityFrameworkCore.Abstractions. This package contains attributes and interfaces that you can use in your projects to light up EF Core features without taking a dependency on EF Core as a whole.

Read the documentation on owned entity types for more information about this topic.

EF commands in the SDK

The dotnet-ef commands now ship as part of the .NET Core SDK, so it will no longer be necessary to use DotNetCliToolReference in the project to be able to use migrations or to scaffold a DbContext from an existing database.

Note that these new commands support EF Core 2.0 and 2.1; they do not support EF Core 1.0 and 1.1. For those versions, a global.json file should be used to pin an earlier SDK, and the DotNetCliToolReference to the old tools must be retained.
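For example, a global.json at the root of the repository can pin an earlier SDK (the version number below is illustrative; use whichever 2.0-era SDK you have installed):

```json
{
  "sdk": {
    "version": "2.0.3"
  }
}
```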

State change events

The new Tracked and StateChanged events on ChangeTracker can be used to write logic that reacts to entities entering the DbContext or changing their state.
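For example (a hedged sketch; the handler bodies are purely illustrative), you can subscribe to both events on any DbContext instance:

```csharp
context.ChangeTracker.Tracked += (sender, e) =>
{
    // Fires when an entity is first tracked, whether from a query or via Attach/Add
    Console.WriteLine(
        $"{e.Entry.Entity.GetType().Name} tracked " +
        (e.FromQuery ? "from a query" : "explicitly"));
};

context.ChangeTracker.StateChanged += (sender, e) =>
{
    // Fires when a tracked entity moves between states (e.g. Unchanged -> Modified)
    Console.WriteLine(
        $"{e.Entry.Entity.GetType().Name}: {e.OldState} -> {e.NewState}");
};
```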

FromSql code analyzer

A new code analyzer is included with EF Core that detects possibly unsafe usages of our raw-SQL APIs (for example, FromSql) in which SQL parameters are not generated.
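As an illustration of the kind of code the analyzer targets (the Customers query here is hypothetical):

```csharp
// Flagged: user input concatenated directly into the SQL string
var risky = context.Customers
    .FromSql("SELECT * FROM Customers WHERE Name = '" + name + "'");

// Not flagged: with an interpolated string, EF Core turns {name} into a DbParameter
var safe = context.Customers
    .FromSql($"SELECT * FROM Customers WHERE Name = {name}");
```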

Obtaining the bits

The new bits are available in NuGet as individual packages, as part of the ASP.NET Core 2.1 meta-package and in the .NET Core 2.1 SDK, also released today.

The recommended way to obtain the new packages for ASP.NET Core applications is through installation of the new SDK. See the details in the Getting started section of the ASP.NET Core blog post.

For other types of application, either the SDK can be installed or the packages can be updated using the dotnet command line tool or NuGet.

Database provider

EF Core also requires a database provider for the database system you wish to use. The database providers for Microsoft SQL Server and the in-memory database are included in the ASP.NET Core meta-package. For other providers and for non-ASP.NET applications, the provider should be installed from a NuGet package. For example, using dotnet on the command-line:

$ dotnet add package Microsoft.EntityFrameworkCore.Sqlite

It is recommended that you also add a direct reference to the relational provider package to help ensure that you get all the newest EF Core bits. For example:

$ dotnet add package Microsoft.EntityFrameworkCore.Relational

When updating packages, make sure that all EF Core packages are updated to the 2.1.0 version. Mixing EF Core or infrastructure packages from older .NET Core versions (including previous 2.1 preview/rc bits) will likely cause errors.

Provider compatibility

As we mentioned in previous announcements, some of the new features in 2.1, such as value conversions, require an updated database provider. However, it was our original goal that existing providers developed for EF Core 2.0 would be compatible with EF Core 2.1 as long as you didn’t try to use the new features.

In practice, testing has shown that some of the EF Core 2.0 providers are not going to be compatible with 2.1. Also, there have been changes in the code necessary to support the new features since Preview 1 and Preview 2. Therefore, we recommend that you use a provider that has been updated for EF Core 2.1.

We have news that some of these updated providers will be available within days of this announcement. Others may take longer.

We have been working and will continue to work with provider writers to make sure we identify and address any issues with the upgrade. In the particular case of Pomelo.EntityFrameworkCore.MySql, we are actively working with the developers to help them get it ready for 2.1.

If you experience any new incompatibility, please report it by creating an issue in our GitHub repository.

Thank you!

As always, the entire Entity Framework team wants to express our deep gratitude to everyone who has helped in making this release better by trying early builds, providing feedback, reporting bugs, and contributing code.

Please try EF Core 2.1, and keep posting any new feedback to our issue tracker!

Announcing .NET Core 2.1


We’re excited to announce the release of .NET Core 2.1. It includes improvements to performance, the runtime, and tools. It also includes a new way to deploy tools as NuGet packages. We’ve added a new primitive type called Span<T> that operates on data without allocations. There are many other new APIs, focused on cryptography, compression, and Windows compatibility. It is the first release to support Alpine Linux and ARM32 chips. You can start updating existing projects to target .NET Core 2.1 today. The release is compatible with .NET Core 2.0, making updating easy.

ASP.NET Core 2.1 and Entity Framework Core 2.1 are also releasing today.

You can download and get started with .NET Core 2.1, on Windows, macOS, and Linux:

Docker images are available at microsoft/dotnet for .NET Core and ASP.NET Core.

The Build 2018 conference was earlier this month. We had several in-depth presentations on .NET Core. Check out Build 2018 sessions for .NET on Channel9.

You can see complete details of the release in the .NET Core 2.1 release notes. Related instructions, known issues, and workarounds are included in the release notes. Please report any issues you find in the comments or at dotnet/core #1614.

Thanks to everyone who contributed to .NET Core 2.1. You’ve helped make .NET Core a better product!

Long-term Support

.NET Core 2.1 will be a long-term support (LTS) release. This means that it is supported for three years. We recommend that you make .NET Core 2.1 your new standard for .NET Core development.

We intend to ship a small number of significant updates in the next 2-3 months and then officially call .NET Core 2.1 an LTS release. After that, updates will be targeted on security, reliability, and adding platform support (for example, Ubuntu 18.10). We recommend that you start adopting .NET Core 2.1 now. For applications in active development, there is no reason to hold off deploying .NET Core 2.1 into production. For applications that will not be actively developed after deployment, we recommend waiting to deploy until .NET Core 2.1 has been declared as LTS.

There are a few reasons to move to .NET Core 2.1:

  • Long-term support.
  • Superior performance and quality.
  • New platform support, such as: Ubuntu 18.04, Alpine, ARM32.
  • Much easier to manage platform dependencies in project files and with self-contained application publishing.

We had many requests to make .NET Core 2.0 an LTS release. In fact, that was our original plan. We opted to wait until we had resolved various challenges managing platform dependencies (the last point above). Platform dependency management was a significant problem with .NET Core 1.0 and has gotten progressively better with each release. For example, you will notice that the ASP.NET Core package references no longer include a version number with .NET Core 2.1.

Platform Support

.NET Core 2.1 is supported on the following operating systems:

  • Windows Client: 7, 8.1, 10 (1607+)
  • Windows Server: 2008 R2 SP1+
  • macOS: 10.12+
  • RHEL: 6+
  • Fedora: 26+
  • Ubuntu: 14.04+
  • Debian: 8+
  • SLES: 12+
  • openSUSE: 42.3+
  • Alpine: 3.7+

Note: The runtime ID for Alpine was previously alpine-3.6. There is now a more generic runtime ID for Alpine and similar distros, called linux-musl, to support any Linux distro that uses musl libc. All of the other runtime IDs assume glibc.

Chip support follows:

  • x64 on Windows, macOS, and Linux
  • x86 on Windows
  • ARM32 on Linux (Ubuntu 18.04+, Debian 9+)

Note: .NET Core 2.1 is supported on Raspberry Pi 2+. It isn’t supported on the Pi Zero or other devices that use an ARMv6 chip. .NET Core requires ARMv7 or ARMv8 chips, like the ARM Cortex-A53.

.NET Core Tools

.NET Core now has a new deployment and extensibility mechanism for tools. This new experience is very similar to and was inspired by NPM global tools. You can create your own global tools by looking at the dotnetsay tools sample.

You can try the new tools experience with the dotnetsay tool with the following commands:

dotnet tool install -g dotnetsay
dotnetsay

.NET Core tools are .NET Core console apps that are packaged and acquired as NuGet packages. By default, these tools are framework-dependent applications and include all of their NuGet dependencies. This means that .NET Core tools run on all .NET Core supported operating systems and chip architectures by default, with one set of binaries. By default, the dotnet tool install command looks for tools on NuGet.org. You can use your own NuGet feeds instead.
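To package your own console app as a tool, the project file opts in with PackAsTool. This sketch assumes a netcoreapp2.1 console project; the command name is illustrative:

```xml
<PropertyGroup>
  <OutputType>Exe</OutputType>
  <TargetFramework>netcoreapp2.1</TargetFramework>
  <!-- Makes "dotnet pack" produce a package installable via "dotnet tool install" -->
  <PackAsTool>true</PackAsTool>
  <ToolCommandName>dotnetsay</ToolCommandName>
</PropertyGroup>
```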

At present, .NET Core tools only support global install and require the -g argument. We’re working on various forms of local install, too, and plan to deliver that in a subsequent release.

We expect a whole new ecosystem of tools to establish itself for .NET. @natemcmaster maintains a list of dotnet tools. You might also check out his dotnet-serve tool.

The following existing DotNetCliToolReference tools have been converted to in-box tools.

  • dotnet watch
  • dotnet dev-certs
  • dotnet user-secrets
  • dotnet sql-cache
  • dotnet ef

Remove project references to these tools when you upgrade to .NET Core 2.1.

Build Performance Improvements

Improving the performance of the .NET Core build was perhaps the biggest focus of the release. It is greatly improved in .NET Core 2.1, particularly for incremental builds. These improvements apply to both dotnet build on the command line and to builds in Visual Studio.

The following image shows the improvements that we’ve made, compared to .NET Core 2.0. We focused on large projects, as you can see from the image.

.NET Core 2.1 Incremental Build-time performance improvements


Note: 2.1 in the image refers to the 2.1.300 SDK version.

Note: These benchmarks were produced from projects at mikeharder/dotnet-cli-perf.

We added long-running servers to the .NET Core SDK to improve the performance of common development operations. The servers are additional processes that run for longer than a single dotnet build invocation. Some of these are ports from the .NET Framework and others are new.

The following SDK build servers have been added:

  • VBCSCompiler
  • MSBuild worker processes
  • Razor server

The primary benefit of these servers is that they skip the need to JIT compile large blocks of code on every dotnet build invocation. They auto-terminate after a period of time. See release notes for more information on finer control of these build servers.

Runtime Performance Improvements

See Performance Improvements in .NET Core 2.1 for an in-depth exploration of all the performance improvements in the release.

Networking Performance Improvements

We built a new from-the-ground-up HttpClientHandler called SocketsHttpHandler to improve networking performance. It’s a C# implementation of HttpClient based on .NET sockets and Span<T>.

SocketsHttpHandler is now the default implementation for HttpClient. The biggest win of SocketsHttpHandler is performance. It is a lot faster than the existing implementation. It also eliminates platform-specific dependencies and enables consistent behavior across operating systems.

See the .NET Core 2.1 release notes for instructions on how to enable the older networking stack.

Span<T>, Memory<T>, and friends

We are entering a new era of memory-efficient and high-performance computing with .NET, with the introduction of Span<T> and related types. Today, if you want to pass the first 1000 elements of a 10,000 element array, you need to make a copy of those 1000 elements and pass that copy to your caller. That operation is expensive in both time and space. The new Span<T> type enables you to provide a virtual view of that array without the time or space cost. Span<T> is a struct, which means that you can enable complex pipelines of parsing or other computation without allocating. We are using this new type extensively in corefx for this reason.

Jared Parsons gives a great introduction in his Channel 9 video C# 7.2: Understanding Span. Stephen Toub goes into even more detail in C# – All About Span: Exploring a New .NET Mainstay.

In the simplest use case, you can cast an array to a Span<T>, as follows.

You can Slice a Span<T>, as follows.
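The original snippets were published as images; the following sketch reconstructs them so that it reproduces the output shown next (the array contents at indices 42 and 43 are assumptions inferred from that output):

```csharp
var ints = new int[100];
ints[42] = 42;
ints[43] = 43;

// An array converts implicitly to a Span<T> over the same memory
Span<int> spanInts = ints;
Console.WriteLine($"ints length: {ints.Length}");
Console.WriteLine($"spanInts length: {spanInts.Length}");

// Slice creates a view over elements 42..43 without copying
Span<int> slicedInts = spanInts.Slice(42, 2);
Console.WriteLine($"slicedInts length: {slicedInts.Length}");
Console.WriteLine("slicedInts contents");
foreach (var value in slicedInts)
    Console.WriteLine(value);

// Writing through the slice mutates the underlying array
slicedInts[0] = 21300;
Console.WriteLine("slicedInts contents");
foreach (var value in slicedInts)
    Console.WriteLine(value);
```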

This code produces the following output:

ints length: 100
spanInts length: 100
slicedInts length: 2
slicedInts contents
42
43
slicedInts contents
21300
43

Brotli Compression

Brotli is a general-purpose lossless compression algorithm that compresses data with ratios comparable to the best currently available general-purpose compression methods. It is similar in speed to deflate but offers denser compression. The specification of the Brotli Compressed Data Format is defined in RFC 7932. Brotli encoding is supported by most web browsers, major web servers, and some CDNs (Content Delivery Networks). The .NET Core Brotli implementation is based on the C code provided by Google at google/brotli. Thanks, Google!

Brotli support has been added to .NET Core 2.1. Operations may be completed using either the stream-based BrotliStream or the high-performance span-based BrotliEncoder/BrotliDecoder classes. You can see it used in the following example.
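The original sample was published as an image; the following is a rough reconstruction using the stream-based BrotliStream (the URL comes from the output below, while the method shape and exact lengths printed at runtime are assumptions):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Net.Http;
using System.Threading.Tasks;

static async Task CompressReadmeAsync()
{
    var url = "https://raw.githubusercontent.com/dotnet/core/master/README.md";
    byte[] content = await new HttpClient().GetByteArrayAsync(url);
    Console.WriteLine($"Request URL: {url}");
    Console.WriteLine($"Initial content length: {content.Length}");

    // Compress with Brotli
    byte[] compressed;
    using (var output = new MemoryStream())
    {
        using (var brotli = new BrotliStream(output, CompressionMode.Compress))
            brotli.Write(content, 0, content.Length);
        compressed = output.ToArray();
    }
    Console.WriteLine($"Compressed content length: {compressed.Length}");

    // Round-trip: decompress back to the original bytes
    byte[] decompressed;
    using (var input = new MemoryStream(compressed))
    using (var brotli = new BrotliStream(input, CompressionMode.Decompress))
    using (var output = new MemoryStream())
    {
        brotli.CopyTo(output);
        decompressed = output.ToArray();
    }
    Console.WriteLine($"Decompressed content length: {decompressed.Length}");
    Console.WriteLine(
        $"Compression ratio: {100 - 100.0 * compressed.Length / content.Length:F1}%");
}
```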

This code produces the following output:

Request URL: https://raw.githubusercontent.com/dotnet/core/master/README.md
Initial content length: 2244
Compressed content length: 727
Decompressed content length: 2244
Compression ratio: 67.6%
First 10 lines of decompressed content

# .NET Core Home
The dotnet/core repository is a good starting point for .NET Core.
The latest major release is [.NET Core 2.1](release-notes/2.1/2.1.0.md). The latest patch updates are listed in [.NET Core release notes](release-notes/README.md)
## Download the latest .NET Core SDK
* [.NET Core 2.1 SDK](release-notes/download-archives/2.1.0-download.md)

New Cryptography APIs

The following enhancements have been made to .NET Core cryptography APIs:

  • New SignedCms APIs — System.Security.Cryptography.Pkcs.SignedCms is now available in the System.Security.Cryptography.Pkcs package. The .NET Core implementation is available on all .NET Core platforms and has parity with the class from .NET Framework. See: dotnet/corefx #14197.
  • New X509Certificate.GetCertHash overload for SHA-2 — New overloads for X509Certificate.GetCertHash and X509Certificate.GetCertHashString accept a hash algorithm identifier to enable callers to get certificate thumbprint values using algorithms other than SHA-1. dotnet/corefx #16493.
  • New Span<T>-based cryptography APIs — Span-based APIs are available for hashing, HMAC, (cryptographic) random number generation, asymmetric signature generation, asymmetric signature processing, and RSA encryption.
  • Rfc2898DeriveBytes performance improvements — The implementation of Rfc2898DeriveBytes (PBKDF2) is about 15% faster, thanks to an internal Span<T>-based implementation. Users who benchmarked an iteration count for an amount of server time may want to update the iteration count accordingly.
  • Added CryptographicOperations class — CryptographicOperations.FixedTimeEquals takes a fixed amount of time to return for any two inputs of the same length, making it suitable for use in cryptographic verification to avoid contributing to timing side-channel information. CryptographicOperations.ZeroMemory is a memory clearing routine that cannot be optimized away via a write-without-subsequent-read optimization.
  • Added static RandomNumberGenerator.Fill — The static RandomNumberGenerator.Fill will fill a Span<byte> with random values using the system-preferred CSPRNG, and does not require the caller to manage the lifetime of an IDisposable resource.
  • Added support for RFC 3161 cryptographic timestamps — New API to request, read, validate, and create TimestampToken values as defined by RFC 3161.
  • Added Unix EnvelopedCms — The EnvelopedCms class has been added for Linux and macOS.
  • Added ECDiffieHellman — Elliptic-Curve Diffie-Hellman (ECDH) is now available on .NET Core via the ECDiffieHellman class family with the same surface area as .NET Framework 4.7.
  • Added RSA-OAEP-SHA2 and RSA-PSS to Unix platforms — Starting with .NET Core 2.1, the instance provided by RSA.Create() can always encrypt or decrypt with OAEP using a SHA-2 digest, as well as generate or validate signatures using RSA-PSS.

Windows Compatibility Pack

When you port existing code from the .NET Framework to .NET Core, you can use the Windows Compatibility Pack. It provides access to an additional 20,000 APIs, compared to what is available in .NET Core. This includes System.Drawing, EventLog, WMI, Performance Counters, and Windows Services. See Announcing the Windows Compatibility Pack for .NET Core for more information.

The following example demonstrates accessing the Windows registry with APIs provided by the Windows Compatibility Pack.
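The sample code was published as an image; here is a minimal sketch, assuming the Microsoft.Win32.Registry APIs from the compatibility pack and a registry key and value that exist on the machine (the Personalize key and AppsUseLightTheme value are illustrative):

```csharp
using System;
using Microsoft.Win32; // available on .NET Core via the Windows Compatibility Pack

class RegistrySample
{
    static void Main()
    {
        const string subKey = @"Software\Microsoft\Windows\CurrentVersion\Themes\Personalize";
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(subKey))
        {
            // OpenSubKey returns null when the key does not exist
            object value = key?.GetValue("AppsUseLightTheme");
            Console.WriteLine($"Apps use light theme: {value ?? "unknown"}");
        }
    }
}
```

Note that these APIs throw PlatformNotSupportedException on non-Windows platforms, so calls should be guarded when code is shared across operating systems.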

Self-contained application publishing

dotnet publish now publishes self-contained applications with a serviced runtime version. When you publish a self-contained application with the new SDK, your application will include the latest serviced runtime version known by that SDK. When you upgrade to the latest SDK, you’ll publish with the latest .NET Core runtime version. This applies for .NET Core 1.0 runtimes and later.

Self-contained publishing relies on runtime versions on NuGet.org. You do not need to have the serviced runtime on your machine.

Using the .NET Core 2.0 SDK, self-contained applications are published with the .NET Core 2.0.0 runtime unless a different version is specified via the RuntimeFrameworkVersion property. With this new behavior, you’ll no longer need to set this property to select a higher runtime version for self-contained applications. The easiest approach going forward is to always install and publish with the latest SDK.

Docker

Docker images for .NET Core 2.1 are available at microsoft/dotnet on Docker Hub. We’ve made a few changes relative to .NET Core 2.0. We have consolidated the set of Docker Hub repositories that we use for .NET Core and ASP.NET Core. We will use microsoft/dotnet as the only repository that we publish to for .NET Core 2.1 and later releases.

We added a set of environment variables to .NET Core images to make it easier to host ASP.NET Core sites at any .NET Core image layer and to enable dotnet watch in SDK container images without additional configuration.

.NET Core Docker Samples have been moved to the dotnet/dotnet-docker repo. The samples have been updated for .NET Core 2.1. New samples have been added, including Hosting ASP.NET Core Images with Docker over HTTPS.

For more information, see .NET Core 2.1 Docker Image Updates.

.NET Core 2.1 and Compatibility

.NET Core 2.1 is a highly compatible release. .NET Core 2.0 applications will run on .NET Core 2.1 in the absence of .NET Core 2.0 being installed. This roll-forward behavior only applies to minor releases; .NET Core 1.1 will not roll forward to 2.0, nor will .NET Core 2.0 roll forward to 3.0.

See the .NET Core 2.1 release notes for instructions on how to disable minor-version roll-forward.

If you built .NET Core 2.1 applications or tools with .NET Core 2.1 preview releases, they must be rebuilt with the final .NET Core 2.1 release. Preview releases do not roll-forward to final releases.

Early Snap Installer Support

We have been working on bringing .NET Core to Snap and are ready to hear what you think. Snaps are an emerging application installation and sandboxing technology that we think is intriguing. The Snap install works well on Debian-based systems, while other distros such as Fedora have challenges that we’re working to run down. The following steps can be used if you would like to give this a try.

.NET Core 2.1 Runtime and SDK snaps are available:

  • sudo snap install dotnet-sdk --candidate --classic
  • sudo snap install dotnet-runtime-21 --candidate

Watch for future posts delving into what Snaps are about. In the meantime, we would love to hear your feedback.

Closing

.NET Core 2.1 is a big step forward for the platform. We’ve significantly improved performance, added many APIs, and added a new way of deploying tools. We’ve also added support for new Linux distros and ARM32, another CPU type. This release expands the places you can use .NET Core and makes it much more efficient everywhere.

We expect .NET Core 2.1 to be available in Azure App Service later this week.

You can see the progress we made with the .NET Core 2.1 interim releases: RC1, Preview 2, Preview 1. Thanks again to everyone who contributed to the release. It helps a lot.

Announcing Windows Community Toolkit v3.0


I’m excited to announce the largest update to the Windows Community Toolkit yet, version 3.0.

As announced a few weeks ago, we recently changed the name of the community toolkit to better align with all Windows developers, and today we are releasing our biggest update yet which introduces:

  • A new package for WPF and WinForms developers that includes the new Edge WebView
  • A new package for all XAML UWP developers to enable usage of eye gaze APIs in XAML
  • A new package for all .NET UWP developers to help in writing runtime API checks
  • A new package introducing new controls to access the Microsoft Graph
  • New controls and APIs in existing packages
  • Fluent updates to existing controls with support for light and dark theme
  • Updated documentation, including code examples in Visual Basic
  • Many improvements and bug fixes

Let’s take a look at some of these updates in more details.

A new modern WebView for .NET and WPF apps

Microsoft is bringing the latest Microsoft Edge rendering engine to .NET WinForms and WPF apps. However, working with the WebViewControl and WebView API may feel foreign to native .NET developers, so we’re building additional controls to simplify the experience and provide a more familiar environment. These controls wrap the WebViewControl to make the control feel more like a native .NET WinForms or WPF control, and provide a subset of the members from that class.


The WinForms and WPF controls are available today in the Microsoft.Toolkit.Win32.UI.Controls package. This means that upgrading from the Trident-powered WebBrowser control to the EdgeHTML-powered WebView in your WinForms or WPF app can be as easy as dragging in a new control from the toolbox.

Visit the docs for the full documentation.

New Gaze Interaction Library to integrate eye gaze in all XAML apps

Gaze input is a powerful way to interact with Windows and UWP apps that is especially useful as an assistive technology for users with neuro-muscular diseases (such as ALS) and other disabilities involving impaired muscle or nerve functions. The Windows 10 April 2018 Update now includes Windows eye tracking APIs, and to enable developers to leverage those APIs in their XAML apps, we are introducing the Gaze Interaction Library in the Microsoft.Toolkit.Uwp.Input.GazeInteraction package. For example, to enable eye gaze on your XAML page, add the following attached property:


xmlns:gaze="using:Microsoft.Toolkit.Uwp.Input.GazeInteraction"
gaze:GazeInput.Interaction="Enabled"

The API allows you to customize how eye gaze works with your UI. Make sure to read this blog to learn more and visit the docs for the full documentation.

Platform Specific Analyzer

When writing platform adaptive code, developers should ensure that the code checks for the presence of an API before calling it. The platform specific analyzer, available through the Microsoft.Toolkit.Uwp.PlatformSpecificAnalyzer NuGet package, is a Roslyn analyzer for both C# and Visual Basic that can detect when you are using APIs that might not be available on all versions of Windows 10 and help you add the appropriate code checks.

Example of the Platform Specific Analyzer

Just add the NuGet package to your app and the analyzer will automatically check your code as you develop.
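The fix the analyzer suggests is an ApiInformation guard. A sketch of the pattern (the contract version shown corresponds, to our understanding, to the April 2018 Update; the guarded call is hypothetical):

```csharp
using Windows.Foundation.Metadata;

// Only call a newer API when the contract version that introduced it is present
if (ApiInformation.IsApiContractPresent("Windows.Foundation.UniversalApiContract", 6))
{
    // APIs added in UniversalApiContract version 6 are safe to use here
}
else
{
    // Fall back to behavior supported on earlier Windows 10 releases
}
```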

Microsoft Graph controls

As part of the new Microsoft.Toolkit.Uwp.UI.Controls.Graph package, we are adding four new controls to enable developers to access the Microsoft Graph in their XAML apps.

ProfileCard and AadLogin

The ProfileCard control is a simple way to display a user in multiple different formats using a combination of name, image, and email. The AadLogin control leverages the Microsoft Graph service to enable a basic Azure Active Directory (AAD) sign-in process.

Example of the ProfileCard control

PeoplePicker

The PeoplePicker control allows for selection of one or more users from an organizational AD.

Example of PeoplePicker

SharePointFileList

The SharePointFileList control allows the user to navigate through a folder and files and displays a simple list of SharePoint files.


New controls and helpers

In addition to the new packages, the toolkit is also adding new controls and helpers to existing packages which are worth mentioning here.

CameraHelper and CameraPreview

The CameraHelper provides helper methods to easily use the available camera frame sources to preview video, capture video frames and software bitmaps. With one line of code, developers can subscribe and get real time video frames and software bitmaps as they arrive from the selected camera source.

The CameraPreview XAML control leverages the CameraHelper to easily preview the video frames in your apps.

In your XAML:


<controls:CameraPreview x:Name="CameraPreviewControl" />

In your C#:


await CameraPreviewControl.StartAsync();
CameraPreviewControl.CameraHelper.FrameArrived += CameraPreviewControl_FrameArrived;

Read the docs for the full documentation.

UniformGrid

The UniformGrid control is a responsive layout control which arranges items in an evenly-spaced set of rows or columns to fill the total available display space. Each cell in the grid, by default, will be the same size. If you are moving UniformGrid XAML from WPF, just add the namespace prefix for the toolkit.


<controls:UniformGrid>
      <Border Background="AliceBlue" Grid.Row="1" Grid.Column="1"><TextBlock Text="1"/></Border>
      <Border Background="Cornsilk"><TextBlock Text="2"/></Border>
      <Border Background="DarkSalmon"><TextBlock Text="3"/></Border>
      <Border Background="Gainsboro"><TextBlock Text="4"/></Border>
      <Border Background="LightBlue"><TextBlock Text="5"/></Border>
      <Border Background="MediumAquamarine"><TextBlock Text="6"/></Border>
      <Border Background="MistyRose"><TextBlock Text="7"/></Border>
    </controls:UniformGrid>

Example of the UniformGrid control

Read the docs for the full documentation.

InfiniteCanvas

The InfiniteCanvas control is a canvas that supports infinite scrolling, inking, formatted text, zooming, undo and redo, and exporting and importing canvas data.

Example of the InfiniteCanvas control

Read the docs for the full documentation.

There is more…

Make sure to visit the release notes to get the full list of updates for this release. The community continues to work enthusiastically together to add value for all Windows developers and version 3.0 is the biggest effort so far. The toolkit would not be possible if it were not for the hard work of our contributors.

As a reminder, you can get started by following this tutorial, or preview the latest features by installing the UWP Community Toolkit Sample App from the Microsoft Store. If you would like to contribute, please join us on GitHub! To join the conversation on Twitter, use the #windowstoolkit hashtag.

Happy coding!

The post Announcing Windows Community Toolkit v3.0 appeared first on Windows Developer Blog.

Automatically change your Audio Input, Output and Volume per application in Windows 10


I recently blogged about an amazing little utility called AudioSwitcher that makes it two-clicks easy to switch your audio inputs and outputs. I need to switch audio devices a lot as I'm either watching video, doing a podcast, doing a conference call, playing a game, etc. That's at least three different "scenarios" for my audio setup. I've got 5 inputs and 5 outputs and I've seen PC audiophiles with even more. So I set up this AudioSwitcher and figured, cool, solved that silly problem.

Then a little birdie said that I should look closer at Windows 10 itself. What? I know this OS like the back of my hand! Nonsense!

Hit the Start Menu and search for either "Sound Mixer" or "App Volume"

Sound mixer options

There's a page that does double duty called App Volume and Device Preferences.

You can also get to it from the regular Settings | Audio page:

change the device or app volume

See where it says "Change the device or app volume?" Ok, now DRINK THIS IN.

You can set the volume in active apps on an app-by-app basis. Cool. NOT IMPRESSED ARE YOU? Of course not, because while that's a lovely feature it's not the hidden power I'm talking about.

You can set the Preferred Input and Output device on an App by App Basis.

App Volume and Device Preferences

You can set the Preferred Input and Output device on an App by App Basis.

Read that again. I'll wait.

Rather than me constantly using the Audio Switcher (lovely as it is) I'll just set my ins and outs for each app.

The only catch is that this list only shows the apps that are currently using the mic/speaker, so if you want to get a nice setup, you'll want to run apps in order to change the settings for your app.

  • Here I've got the system sounds running through Default (usually the main speakers and the default mic is a webcam)
  • The Speech Runtime (I use WIN+H to use Windows 10 built-in Dragon-Naturally-Style-But-Not free dictation in any app) uses the Webcam mic explicitly as it has the best recognition in my experience.
  • Skype for Business is now using the phone. You can certainly set these things in the apps themselves, but in my experience Skype for Business doesn't care about your feelings or your audio settings. ;)
  • I record my podcast with Zencastr so I've setup Chrome for my preferred/optimal settings.

I can still use the AudioSwitcher but now my defaults are contextual so I'm switching a LOT LESS.

What do you think? Did YOU know this existed?


Sponsor: Learn how .NET in 2018 addresses the challenges developers are working on with future-focused technology. Get the new whitepaper on "The State of .NET in 2018" by the Progress Telerik team!



© 2018 Scott Hanselman. All rights reserved.
     

Receiving and handling HTTP requests anywhere with the Azure Relay


If you followed Microsoft’s coverage from the Build 2018 conference, you may have been as excited as we were about the new Visual Studio Live Share feature that allows instant, remote, peer-to-peer collaboration between Visual Studio users, no matter where they are. One developer could be sitting in a coffee shop and another on a plane with in-flight WiFi, and yet both can collaborate directly on code.

The "networking magic" that enables the Visual Studio team to offer this feature is the Azure Relay, which is a part of the messaging services family along with Azure Service Bus, Azure Event Hubs, and Azure Event Grid. The Relay is, indeed, the oldest of all Azure services, with the earliest public incubation having started exactly 12 years ago today, and it was amongst the handful of original services that launched with the Azure platform in January 2010.

In the meantime, the Relay has learned to speak a fully documented open protocol that can work with any WebSocket client stack, and allows any such client to become a listener for inbound connections from other clients, without needing inbound firewall rules, public IP addresses, or DNS registrations. Since all inbound communication terminates inside the application layer instead of far down at the network link level, the Relay is also an all around more secure solution for reaching into individual apps than using virtual private network (VPN) technology.

In time for the Build 2018 conference, the Relay learned a brand-new trick that you may have missed learning about amidst the torrent of other Azure news, so we’re telling you about it again today: The Relay now also supports relayed HTTP requests.

This feature is very interesting for applications or application components that run inside containers, where it's difficult to provide a public endpoint, and it is especially well-suited for implementing Webhooks that can be integrated with Azure Event Grid.

The Relay is commonly used in scenarios where applications or devices must be reached behind firewalls. Typical application scenarios include the integration of cloud-based SaaS applications with points of sale or service (shops, coffee bars, restaurants, tanning salons, gyms, repair shops) or with professional services offices (tax advisors, law offices, medical clinics). Furthermore, the Relay is increasingly popular with corporate IT departments who use relay based communication paths instead of complex VPN setups. We will have more news regarding such scenarios in the near future.

The new HTTP support lets you create and host a publicly reachable HTTP(S) listener anywhere, even on your phone or any other device, and leave it to the Azure Relay service to provide a resolvable DNS name, a TLS server certificate, and a publicly accessible IP address. All your application needs is outbound WebSocket connectivity to the Azure Relay over the common HTTPS port 443.

For illustration, let’s take a brief look at a simple Node.js example. First, here’s a minimal local HTTP listener built with Node.js “out of the box”:

var http = require('http');
var port = process.env.PORT || 1337;

http.createServer(function (req, res) {
     res.writeHead(200, { 'Content-Type': 'text/plain' });
     res.end('Hello World\n');
}).listen(port);

This is the equivalent Node.js application using the Azure Relay:

var http = require('hyco-https');

var uri = http.createRelayListenUri("cvbuild.servicebus.windows.net", "app");
var server = http.createRelayedServer({
       server: uri,
       token: () => http.createRelayToken(uri, "listen", "{…key…}")
     },
     function (req, res) {
         res.writeHead(200, { 'Content-Type': 'text/plain' });
         res.end('Hello World\n');
}).listen();

The key changes are that the Relay application uses the ‘hyco-https’ module instead of Node.js’ built-in ‘http’ module, and that the server is created using the ‘createRelayedServer’ method supplying the endpoint and security token information for connecting to the Relay. Most important: The Node.js HTTP handler code is completely identical.

For .NET Standard, we have extended the existing Relay API in the latest preview of the Microsoft.Azure.Relay NuGet package to also support handling HTTP requests. You create a HybridConnectionListener as you do for Websocket connections, and then just add a RequestHandler callback.

var listener = new HybridConnectionListener(uri, tokenProvider);
listener.RequestHandler = (context) =>
{
     context.Response.StatusCode = HttpStatusCode.OK;
     context.Response.StatusDescription = "OK";
     using (var sw = new StreamWriter(context.Response.OutputStream))
     {
         sw.WriteLine("hello!");
     }
     context.Response.Close();
};

If you want existing ASP.NET Core services to listen for requests on the Relay, you can use the just-released Microsoft.Azure.Relay.AspNetCore NuGet package, which allows hosting existing ASP.NET Core apps behind the Relay by adding the "UseAzureRelay()" extension to the web host builder and configuring the connection string of a Hybrid Connection shared access rule (see readme, more samples).

    public static IWebHost BuildWebHost(string[] args) =>
             WebHost.CreateDefaultBuilder(args)
                 .UseStartup<Startup>()
                 .UseAzureRelay(options =>
                 {
                     options.UrlPrefixes.Add(connectionString);
                 })
                 .Build();

The HTTP feature is now in preview in production, meaning you can use it alongside the existing Relay features, complete with support and SLA. Because it’s in preview and not yet entirely in its final shape, we might still make substantial changes to the HTTP-related wire protocol.

Since the Relay isn’t a regular reverse proxy, there are some lower-level HTTP details that the Relay overrides and that we will eventually try to align. One known issue of that sort is that the Relay always transforms HTTP responses to use chunked transfer encoding; while that might be fairly annoying for protocol purists, it doesn’t have a material impact on most application-level use cases.
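To make the chunked-encoding point concrete, here is a toy decoder for HTTP/1.1 chunked bodies, showing what an always-chunked response looks like on the wire. This is illustrative only: Node’s `http` client, like virtually every HTTP stack, de-chunks responses transparently, which is why the behavior rarely matters at the application level.

```javascript
// Toy decoder for HTTP/1.1 chunked transfer encoding (illustration only).
// Each chunk is "<size-in-hex>\r\n<data>\r\n"; a zero-size chunk ends the body.
function decodeChunked(raw) {
  let body = '';
  let pos = 0;
  for (;;) {
    const lineEnd = raw.indexOf('\r\n', pos);
    const size = parseInt(raw.slice(pos, lineEnd), 16); // chunk size in hex
    if (size === 0) break;                              // terminating chunk
    body += raw.slice(lineEnd + 2, lineEnd + 2 + size);
    pos = lineEnd + 2 + size + 2;                       // skip trailing CRLF
  }
  return body;
}

// "hello!" split across two chunks of three bytes each:
console.log(decodeChunked('3\r\nhel\r\n3\r\nlo!\r\n0\r\n\r\n')); // → hello!
```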

To get started, check out the tutorials for C# or Node.js and then let us know via the feedback option below the tutorials whether you like the new HTTP feature.

Make Azure IoT Hub C SDK work on tiny devices!


Azure IoT Hub C SDK is written in ANSI C (C99), which makes it well-suited for a variety of platforms with small disk and memory footprint. We recommend at least 64KB of RAM, but the exact memory footprint depends on the protocol used, the number of connections opened, as well as the platform targeted. This blog walks through how to optimize the C SDK for constrained devices.

We release our C SDK as packages on apt-get, NuGet and MBED to accelerate the development process. However, if your system is constrained in ROM or RAM, you may want to build the SDK locally and remove certain features to shrink the footprint of the C SDK. We will be using cmake to demonstrate in this blog. In addition, the programming model for working with constrained devices is different. This blog will also discuss some best practices to reduce memory consumption. There is also official documentation on how to develop for constrained devices available to you.

Building the C SDK for constrained devices

First, you need to prepare your development environment following this guide. When you get to the step for building with cmake, you can invoke flags to remove certain features.

TL;DR: If you are building for a constrained device, consider using this cmake command:

cmake -Duse_amqp=OFF -Duse_http=OFF -Dno_logging=ON -Ddont_use_uploadtoblob=ON <Path_to_cmake>

Remove additional protocol libraries

Our SDKs support five protocols today: MQTT, MQTT over WebSocket, AMQP, AMQP over WebSocket, and HTTPS. Most of our customers run only one or two protocols on a client, so you can remove the protocol libraries you are not using from the SDK. There is additional information about choosing the appropriate communication protocol for your scenario. For example, MQTT is a lightweight protocol that is often better suited for constrained devices. In testing, we found that using MQTT instead of AMQP can shrink ROM usage by 50 percent and RAM usage by 20 percent*.

You can remove AMQP and HTTP libraries using the following cmake command:

cmake -Duse_amqp=OFF -Duse_http=OFF <Path_to_cmake>

Remove SDK logging capability

The C SDK provides extensive logging throughout to help with debugging. In testing, we found that disabling logging can shrink ROM usage by 20 percent*. You can remove the logging capability for production devices using the following cmake command:

cmake -Dno_logging=ON <Path_to_cmake>

Remove upload to blob capability

You can upload large files to Azure Storage using the built-in capability in the SDK. Azure IoT Hub acts as a dispatcher to an associated Azure Storage account. You can use this feature to send media files, large telemetry batches, and logs. Learn more about how to upload files with IoT Hub. If your application does not require this functionality, you can remove this feature using the following cmake command:

cmake -Ddont_use_uploadtoblob=ON <Path_to_cmake>

Running strip on Linux environment

If your binaries run on a Linux system, you can leverage the strip command to reduce the size of the final application after compiling.

strip -s <Path_to_executable>

Programming models for constrained devices

Avoid using the Serializer

The C SDK has an optional serializer which allows you to use declarative mapping tables to define methods and device twin properties. This is designed to simplify development, but it adds overhead, which is not optimal for constrained devices. In this case, consider using the primitive client APIs and parsing JSON with a lightweight parser such as parson.

Use the lower (_LL_) layer

The C SDK supports two programming models. One set of APIs has an _LL_ infix, which stands for lower layer. These APIs are lighter weight and do not spin up worker threads, so you must manually control scheduling. The _LL_ APIs for the device client can be found in this header file. The other set of APIs, without the _LL_ infix, is called the convenience layer, where a worker thread is spun up automatically. For example, the convenience-layer APIs for the device client can be found in this header file. For constrained devices, where each extra thread can take a substantial percentage of system resources, consider using the _LL_ APIs.

To learn more about Azure IoT C SDK architecture:

*ROM and RAM consumption savings are approximations. The exact usage depends on the protocol used, the number of connections opened, as well as the platform targeted.


VNet service endpoints for Azure database services for MySQL and PostgreSQL in preview


This blog post was co-authored by Anitha Adusumilli, Principal Program Manager, Azure Networking.

We recently made Azure database services for MySQL and PostgreSQL generally available. These services offer the community versions of MySQL and PostgreSQL with built-in high availability, a 99.99% availability SLA, elastic scaling for performance, and industry leading security and compliance on Azure. Since general availability, we have continued to bring new features and capabilities like increased storage and availability across more regions worldwide.

We are excited to announce the public preview of Virtual Network (VNet) service endpoints for Azure Database for MySQL and PostgreSQL in all regions where the service is available. Visit region expansion for MySQL and PostgreSQL for service availability. VNet service endpoints enable you to isolate connectivity to your logical server to only a given subnet or set of subnets within your virtual network. The traffic to Azure Database for MySQL and/or PostgreSQL from your VNet always stays within the Azure backbone network, and this direct route is preferred over any specific routes that send Internet traffic through virtual appliances or on-premises.

There is no additional billing for virtual network access through service endpoints. The current pricing model for Azure Database for MySQL and PostgreSQL applies as is.


Using firewall rules and VNet service endpoints together

Turning on VNet service endpoints does not override firewall rules that you have provisioned on your Azure Database for MySQL or PostgreSQL. Both continue to be applicable.

VNet service endpoints don’t extend to on-premises. To allow access from on-premises, firewall rules can be used to limit connectivity only to your public (NAT) IPs.

To enable VNet protection, visit these articles for Azure Database for MySQL and PostgreSQL.

Turning on service endpoints for servers with pre-existing firewall rules

When you connect to your server with service endpoints turned on, the source IP of database connections switches to the private IP space of your VNet. Configuration uses the “Microsoft.Sql” shared service tag, which covers all Azure database services including Azure Database for MySQL, Azure Database for PostgreSQL, Azure SQL Database Managed Instance, and Azure SQL Data Warehouse. If your server or database firewall rules currently allow specific Azure public IPs, then connectivity breaks until you allow the given VNet/subnet by specifying it in the VNet firewall rules. To ensure connectivity, you can preemptively specify VNet firewall rules before turning on service endpoints by using the IgnoreMissingServiceEndpoint flag.

Pre-existing firewall rules

Support for App Service Environment

As part of the public preview, we support service endpoints for App Service Environment (ASE) subnets deployed into your VNets.

Next steps

To get started, refer to the documentation “Virtual Network Service Endpoints” and “How to configure VNET Service Endpoints for MySQL and PostgreSQL”.

For feature details and scenarios, please watch the Microsoft Ignite session, Network security for applications in Azure.

Azure Security Center can identify attacks targeting Azure App Service applications


One of Azure’s most popular services is App Service, which enables customers to build and host web applications in the programming language of their choice without managing infrastructure. App Service offers auto-scaling and high availability, and supports both Windows and Linux. It also supports automated deployments from GitHub, Visual Studio Team Services, or any Git repository. At RSA, we announced that Azure Security Center leverages the scale of the cloud to identify attacks targeting App Service applications.

Vulnerabilities in web applications are frequently exploited by attackers, as they are a common and dynamic interface for almost every organization on the internet. Requests to applications running on top of App Service go through several gateways deployed in Azure datacenters around the world, responsible for routing each request to its corresponding application. Recently, Security Center and App Service embarked on a journey aimed at building a security offering to support App Service customers.

By leveraging the visibility that Azure has as a cloud provider, Security Center analyzes App Service internal logs to identify attack methodology across multiple targets, for example, attempts to access the same Uniform Resource Identifier (URI) on various web sites. This type of attacker typically exhibits a pattern of crawling to the same web page on multiple web sites, searching for a particularly vulnerable page or plugin. Security Center can detect and alert on behavior that may slip past Web Application Firewall (WAF) instruments.

WAFs are a widespread mechanism employed to protect web applications. WAFs typically perform rule-based string analysis on the requested URIs and successfully identify prevalent attacks, such as SQL injection, cross-site scripting, and other HTTP violations. While successfully mitigating known attacks, the rule-based approach fails to keep up with the rate of new vulnerabilities.

Security Center leverages the scale of the cloud to identify attacks on App Service applications with a focus on emerging attacks, caught while attackers are in the reconnaissance phase, scanning to identify vulnerabilities across multiple websites hosted on Azure. Security Center’s new capabilities include analytics and machine learning models that cover all interfaces allowing customers to interact with their applications, whether over HTTP or through one of the management methods. Moreover, as a first-party service in Azure, Security Center is also in a unique position to offer host-based security analytics covering the underlying compute nodes for this PaaS. Thus, Security Center can detect attacks against web applications that have already been exploited.


Managing web applications can expose your workload to security vulnerabilities, which need to be considered and addressed. These vulnerabilities can lead to your web applications being exploited and attackers gaining a foothold in your workload. Security Center offers a set of recommendations, for both Windows and Linux web applications, with security best practices that will help guide you through the process of configuring your web applications correctly.

A public preview of this feature is available now on the Standard and Trial tiers of Security Center at no additional cost.

To learn more about Security Center threat detection, see the documentation or review case studies from security researchers about how Security Center detects SQL Brute Force attacks, Bitcoin mining, DDoS attack using cyber threat intelligence, and good applications being used maliciously.

One Week Left to Take State of DevOps Survey

Folks, the State of DevOps Survey closes June 8th. If you haven’t yet, please click this link: https://bit.ly/2FCG8Me  I just reread the results from prior years in Accelerate and I was struck by the findings in Chapter 5. The differences in velocity among high-, medium-, and low-performers are well known and I commented on these... Read More

New round of R Consortium grants announced


The R Consortium has just announced its latest round of project grants. After reviewing the proposals submitted by the R community, the Infrastructure Steering Committee has elected to fund the following projects for the Spring 2018 season:

  • Further updates to the DBI package, to provide a consistent interface between R and databases
  • Updating the infrastructure in R for building binary packages on Windows and Mac
  • A collaboration with Statisticians in the Pharmaceutical Industry to validate R packages to regulatory standards
  • A project to validate and maintain some of the historic numerical algorithms used by R 
  • A working group to facilitate working with US census data in R
  • Consolidation of tools and workflows for handling missing data in R
  • Templates and tools for the development of teaching resources with R

This latest round of grants brings the total funding for projects proposed by R community members to more than $650,000. The R Consortium itself is funded by its members (including my own employer, Microsoft), so if you'd like to see more such projects, why not ask your own employer to become a member?

R Consortium: Announcing the R Consortium ISC Funded Project grant recipients for Spring 2018

 

Recap: Microsoft at PyCon US 2018


Microsoft was a keystone-level sponsor of PyCon US 2018 this year, which took place in Cleveland, Ohio, from May 9-17th. We had a great time interacting with and learning from the community, and we had a lot of fun! In this post we’ll share our experience and useful links from the conference.

Language Summit

The language summit is an annual gathering at PyCon US where various core contributors to the Python language and CPython interpreter discuss various topics in person instead of over email. Contributors and invited guests make presentations about some topic that they want feedback on, and then there is a discussion in the room about what was just presented. Being a distributed, open source team means that getting to have these in-person interactions and discussions is a rare opportunity to not only build bonds between individuals but to also rapidly discuss things rather than waiting on emails being read in various time zones around the globe.

As core developers of Python, Brett Cannon, Dino Viehland, Eric Snow, Steve Dower, and Barry Warsaw all attended the language summit. Brett helped lead a discussion on increasing the diversity of the development team as well as potentially porting the Python launcher to UNIX (it currently only exists for Windows).

Eric presented first, talking about his ongoing project to improve CPython's multi-core story (slides here). This project involves exposing the existing C-API for subinterpreters (multiple Python interpreters in a single process) in the stdlib, adding a mechanism for safely sharing data, and then no longer sharing the GIL between subinterpreters. This accomplishes two valuable things: a CSP-like concurrency model and true multi-core parallelism. The project received a lot of positive feedback throughout the conference.

Eric talking at the Language Summit, photo courtesy of Jake Edge and LWN.net

Eric also presented a brief lightning talk at the language summit about the C-API. In particular, he urged the core team to plan out soon what the future of the C-API should be. The motivation is that the current C-API is a stumbling block to several desirable improvements to the runtime.

Various recaps of everything that was discussed at the language summit have been published on LWN.

Sessions

Microsoft held two sponsor workshops at the conference. The first was Standardized Data Science: The Team Data Science Process by Buck Woody; the materials and sample notebook from this presentation are hosted on Azure Notebooks (tip: sign in to clone the library and run through the sample notebook!).

The second workshop was Python with Visual Studio and Visual Studio Code by Steve Dower. If you missed it at PyCon, you can watch the version of the talk we gave at Microsoft Build: Get Productive with Python Developer Tools.

During the regular conference sessions, Nina Zakharenko gave a great talk, Elegant Solutions for Everyday Python Problems, with many neat and useful Python coding tips. Be sure to check out the slides if you want a quick reference!

Nina Zakharenko’s talk was so busy attendees were sitting on the floor!

Brett Cannon, the developer lead for the Python Extension in VS Code, gave a keynote presentation on Setting Expectations for Open Source Contributions. If you’re contributing to open source projects, you should check it out!

Brett’s keynote reminds everyone to be kind on open source projects

We also hosted a few open spaces on: Python on Windows, Python on Visual Studio Code, and Python in SQL Server. We had great ad-hoc discussions and learned a lot at these open spaces, so definitely keep an eye out for these at future conferences.

Booth + Labs

The Microsoft booth had demo stations for Python tools and services at Microsoft, as well as hands-on labs for conference attendees. We’ll share some links if you want to try out the tools yourself.

Microsoft booth and hands on labs at PyCon US 2018

Below are just some of the demos/products we showed at our booth:

  • Python in Visual Studio Code: our free, cross-platform, and open-source tools for Python developers.
  • Python in Visual Studio: our fully featured Python IDE for Windows developers, and our Ogre3d Embedded Python demo showing debugging across C++ and Python.
  • Azure Web Apps for Containers flask quickstart: using VS Code with the Docker and Azure App Service extensions in VS Code to dockerize a Flask app and publish it to Azure App Service.
  • Visual Studio Live Share: real-time collaborative development across Visual Studio and Visual Studio Code, now with support for Python!
  • Azure Notebooks: our online Jupyter notebooks service with free hosting and sharing
  • Python and ML in SQL Server: embedded python support in SQL server for running intelligent algorithms over your data.
  • Python in Visual Studio Team Services: we showed continuous deployment of Python apps as well as the new support for building and testing Python apps with multi-platform builds on VSTS. We’ll be sharing more about this in the coming weeks!

We also had a photo booth, where many fun pictures were taken. Here’s a fun shot of some of the Microsoft team at the booth:

PyLadies Auction

If you haven’t been to the PyLadies auction before it is definitely an experience and an opportunity to get some one-of-a-kind Python items and donate to the PyLadies foundation.

This year Microsoft donated a stained glass Python logo which went for $2048, and a set of earrings that went for $512!

Photo courtesy of @pyladies

This year the PyLadies auction raised a whopping $30,000 in total!

Developer Sprints

Most of our team stuck around for developer sprints, where we collaborated with others in the community on various open source projects.

Our team worked on the Python extension for VS Code. We had 3 contributors help out with the project, contributing several features, namely:

  • Gevent debugging in the experimental debugger (1699).
  • Syntax highlighting for constraints files (1685).
  • Ability to disable automatic running of tests when a test file is saved (1687).
  • Update to the contributions document with setting up of the extension environment (1684)

Shout out to Bence Nagy, Waleed Sehgal, and Larry Li for your contributions! We also added support for getting started with PyBee in VS Code, with the ability to create, build and run a project.

During the sprints we also got pull request and continuous integration builds of CPython up and running with hosted build time donated by Microsoft. You can see the builds linked from any recent pull request or by browsing the Visual Studio Team Services instance. These builds will enable the CPython developers to review and merge pull requests faster.

We also continued work on our Pyjion JIT compiler for Python; it now runs cross-platform and has a Docker container set up, making it easier to develop and contribute. Thanks to Dusty Phillips for contributing!

See you next time!

We hope to see many of you next time at PyCon 2019, which is in Cleveland from May 1-10th. Our team will also be at EuroPython in Edinburgh this summer from July 23-29!
