This blog was written by Rinku Sreedhar, Senior Program Manager
In Windows 10 we added multiple new sensor features, innovations and changes. In my previous blog, I talked about the new contextual sensing features like activity sensors, pedometers, barometers and proximity sensors. In this post, I will give an overview of three additional sensor features in Windows 10: sensor batching, ReadingTransform and custom sensors.
Sensor Batching
If you have been trying to create an app for sleep monitoring but are concerned about the power impact, here is a solution for you. In Windows 10 we introduced sensor batching for the accelerometer. What is sensor batching? A sensor that supports batching can buffer sensor samples in the sensor hardware and deliver them in a batch instead of delivering them continuously. This allows the application processor to save power, as it can wake up less frequently to receive the samples together in a batch rather than stay awake to process a sample at every data interval.
The diagram below shows how data is collected and then delivered, for both continuous delivery and batched delivery.
With batching, the accelerometer now has two additional properties:
- MaxBatchSize: The maximum number of events that the sensor can cache before it is forced to send them. If this is zero, then batching is not supported by the sensor. The actual batch size may be smaller than this number since the batch FIFO can be shared by multiple sensors.
- ReportLatency: This allows an application to influence how often the sensor sends batches by adjusting the latency. When this property is set for an application, the sensor will send data after the specified amount of time. You can control how much data is accumulated over a given latency by setting the ReportInterval property. Setting this is optional for applications and also optional for sensors to support.
Key points to note:
- Batching is recommended when the workflow is mostly happening in the background, with limited interactions in the foreground / on-screen.
- Maximum ReportLatency = MaxBatchSize * ReportInterval. If you specify a higher value than this, the maximum report latency will be used so you do not lose data.
- If MaxBatchSize = zero, then batching is not supported by the sensor.
- Multiple applications may set a desired latency; in this case, the shortest latency period will be used.
The API pattern is shown below:
// Select a report interval and report latency that is both suitable for the purposes of the app and supported by the sensor.
// This value will be used later to activate the sensor.
uint32 minReportInterval = accelerometer->MinimumReportInterval;
desiredReportInterval = minReportInterval > 16 ? minReportInterval : 16;

// MaxBatchSize will be 0 if the accelerometer does not support batching.
uint32 maxSupportedLatency = desiredReportInterval * accelerometer->MaxBatchSize;
desiredReportLatency = maxSupportedLatency < 10000 ? maxSupportedLatency : 10000;

// Establish the report interval
accelerometer->ReportInterval = desiredReportInterval;

// Establish the report latency. This is a no-op if the accelerometer does not support batching
accelerometer->ReportLatency = desiredReportLatency;
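To round out the picture, here is a minimal C# sketch of how an app might then consume the readings. This is my own illustration rather than part of the SDK sample: the helper class name is hypothetical, and it assumes the batched samples are surfaced through the regular ReadingChanged event, each carrying its own Timestamp.

using System;
using System.Diagnostics;
using Windows.Devices.Sensors;

// Illustrative sketch only: "BatchedAccelerometerLogger" is a hypothetical helper class.
public sealed class BatchedAccelerometerLogger
{
    private Accelerometer accelerometer;

    public void Start()
    {
        accelerometer = Accelerometer.GetDefault();
        if (accelerometer == null || accelerometer.MaxBatchSize == 0)
        {
            return; // no accelerometer present, or batching not supported
        }

        // Same pattern as the C++ snippet above: pick a supported report interval and
        // request up to 10 seconds of latency. Per the key points above, a request larger
        // than MaxBatchSize * ReportInterval falls back to the maximum supported latency.
        uint minReportInterval = accelerometer.MinimumReportInterval;
        accelerometer.ReportInterval = minReportInterval > 16 ? minReportInterval : 16;
        accelerometer.ReportLatency = 10000;

        accelerometer.ReadingChanged += OnReadingChanged;
    }

    private void OnReadingChanged(Accelerometer sender, AccelerometerReadingChangedEventArgs e)
    {
        // When a batch is delivered, the buffered samples arrive through this same event;
        // each reading carries its own Timestamp, so samples can still be ordered and processed.
        AccelerometerReading reading = e.Reading;
        Debug.WriteLine($"{reading.Timestamp:HH:mm:ss.fff} X={reading.AccelerationX:F2} Y={reading.AccelerationY:F2} Z={reading.AccelerationZ:F2}");
    }
}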
A more detailed API reference for the Accelerometer class, with the new properties introduced for batching, can be found here, and the UWP SDK samples can be found here.
ReadingTransform
Were you ever frustrated with aligning your app to the display orientation you wanted? It's now very easy to do with a one-line code change using the Windows 10 UWP APIs.
Here's more information and a bit of background on this. Windows desktop and Windows Phone define their sensor coordinate systems differently; because of this, you were required to write extensive code to transform the sensor data coordinates to match the display orientation.
Most sensors' data (accelerometer, gyroscope, magnetometer, etc.) comprises readings along the X, Y and Z axes, and those readings depend on the way the sensor was integrated into the system.
All landscape-first devices integrate sensors in such a way that their X-axis is along the longer edge and Y-axis is along the shorter edge of the device. Z-axis is perpendicular to the display.
However, all portrait-first devices integrate the sensors in such a way that their X-axis is along the shorter edge and Y-axis is along the longer edge of the device. Z-axis remains perpendicular to the display.
This made it quite complex to have your app run on multiple devices, as you were required to transform the sensor data to the required display orientation. In Windows 10, we added a new property called ReadingTransform that allows you to specify the display orientation to which you want to align the sensor data.
Now it’s easy to do the following:
- To have your UWP apps align with the display orientation you specify so they work across the entire family of Windows 10 devices, including desktop, phone, Xbox, HoloLens, Surface Hub and IoT.
- To convert your existing desktop or Phone app to a UWP app that will work across multiple device families.
Here is the C++ sample code change required to make this happen, using the accelerometer as an example:
void Scenario4_OrientationChanged::OnOrientationChanged(Windows::Graphics::Display::DisplayInformation ^sender, Platform::Object ^args)
{
    if (nullptr != accelerometer)
    {
        accelerometer->ReadingTransform = sender->CurrentOrientation;
    }
}
For more info, here is the complete sample on GitHub.
A few key points to note:
- In addition to the accelerometer, ReadingTransform is supported by the gyroscope, magnetometer, compass, orientation sensor, inclinometer and simple orientation sensor.
- The app should initialize the value to its assumed orientation, i.e. if you developed the app assuming landscape devices, you should initialize it to 'Landscape'.
- The app also needs to register for the OrientationChanged callback, so it is notified when the orientation of the display or monitor changes, as sketched below.
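Putting the last two points together, here is a minimal C# sketch of the same pattern shown in the C++ snippet above; the helper class and method names are hypothetical, but ReadingTransform, DisplayInformation.CurrentOrientation and OrientationChanged are the APIs discussed in this section.

using Windows.Devices.Sensors;
using Windows.Graphics.Display;

// Illustrative sketch only: "OrientationAwareAccelerometer" is a hypothetical helper class.
public sealed class OrientationAwareAccelerometer
{
    private readonly Accelerometer accelerometer = Accelerometer.GetDefault();
    private readonly DisplayInformation displayInfo = DisplayInformation.GetForCurrentView();

    public void Initialize()
    {
        if (accelerometer == null)
        {
            return; // no accelerometer on this device
        }

        // Initialize to the orientation the app was designed for (landscape in this example) ...
        accelerometer.ReadingTransform = DisplayOrientations.Landscape;

        // ... then keep the transform in sync whenever the display orientation changes.
        displayInfo.OrientationChanged += OnOrientationChanged;
    }

    private void OnOrientationChanged(DisplayInformation sender, object args)
    {
        accelerometer.ReadingTransform = sender.CurrentOrientation;
    }
}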
Custom Sensors
Starting with Windows 10, hardware manufacturers have the ability to add new sensor types that don't have a first-class representation, for example air quality sensors, temperature sensors, heart rate sensors, etc. Custom sensors provide a generic API through which IHVs can expose any type of sensor and release them independently of Microsoft's OS ship cycle. Partners who used the Win32 sensor API for custom sensors can now develop Windows Store apps without modifying their hardware, and without the complexity of using low-level HID.
Key points to note:
- Added a new namespace Windows.Devices.Sensors.Custom for Custom Sensors.
- Defined the new CustomSensor, CustomSensorReadingChangedEventArgs and CustomSensorReading classes, modeled very similarly to the other sensor classes (Accelerometer, Gyroscope, Magnetometer…).
- The CustomSensorReading class contains a property with a list of key/value pairs holding the custom data sent from the sensor's driver to the UWP app (the key is a string representing a PROPERTYKEY; values can be integers, Booleans, floats or doubles).
- The CustomSensor class reuses the same eventing mechanisms as the other native sensor classes.
- The CustomSensor class also reuses the same common properties (DeviceId, ReportInterval) as for the other native sensor classes.
The API pattern is shown below:
// The following ID is defined by vendors and is unique to a custom sensor type. Each custom sensor driver
// should define one unique ID.
//
// The ID below is defined in the custom sensor driver sample available in the WDK. It identifies the
// custom sensor CO2 emulation sample driver.
Guid GUIDCustomSensorDeviceVendorDefinedTypeID = new Guid("4025a865-638c-43aa-a688-98580961eeae");

// A property key is defined by vendors for each datafield property a custom sensor driver exposes.
// Property keys are defined per custom sensor driver and are unique to each custom sensor type.
//
// The following example shows a property key representing the CO2 level as defined in the custom sensor
// CO2 emulation driver sample available in the WDK.
// In this example only one key is defined, but other drivers may define more than one key by rev'ing up
// the property key index.
const String CO2LevelKey = "{74879888-a3cc-45c6-9ea9-058838256433} 1";

// A pointer back to the main page. This is needed if you want to call methods in MainPage such as NotifyUser()
MainPage rootPage = MainPage.Current;

private CustomSensor customSensor;
private uint desiredReportInterval;
private DeviceWatcher watcher;

public Scenario1_DataEvents()
{
    String customSensorSelector = "";

    this.InitializeComponent();

    customSensorSelector = CustomSensor.GetDeviceSelector(GUIDCustomSensorDeviceVendorDefinedTypeID);
    watcher = DeviceInformation.CreateWatcher(customSensorSelector);
    watcher.Added += OnCustomSensorAdded;
    watcher.Start();
}

/// <summary>
/// Invoked when the device watcher finds a matching custom sensor device
/// </summary>
/// <param name="watcher">device watcher</param>
/// <param name="customSensorDevice">device information for the custom sensor that was found</param>
private async void OnCustomSensorAdded(DeviceWatcher watcher, DeviceInformation customSensorDevice)
{
    try
    {
        customSensor = await CustomSensor.FromIdAsync(customSensorDevice.Id);
    }
    catch (Exception e)
    {
        await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
        {
            rootPage.NotifyUser("The user may have denied access to the custom sensor. Error: " + e.Message, NotifyType.ErrorMessage);
        });
    }
}
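The snippet above covers discovering and opening the custom sensor. As an illustration of the key/value model described earlier, the following C# sketch shows how the same class could then subscribe for readings and pull the CO2 level out by its property key. The SubscribeToReadings and OnReadingChanged members are my additions; CustomSensorReading.Properties and CO2LevelKey come from the sample above.

// Illustrative sketch only: additional members for the Scenario1_DataEvents class above.
private void SubscribeToReadings()
{
    if (customSensor == null)
    {
        return; // OnCustomSensorAdded has not found a matching custom sensor yet
    }

    customSensor.ReadingChanged += OnReadingChanged;
}

private void OnReadingChanged(CustomSensor sender, CustomSensorReadingChangedEventArgs e)
{
    CustomSensorReading reading = e.Reading;
    if (reading.Properties.ContainsKey(CO2LevelKey))
    {
        // The value type (integer, Boolean, float or double) is defined by the driver.
        System.Diagnostics.Debug.WriteLine("CO2 level: " + reading.Properties[CO2LevelKey]);
    }
}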
A more detailed API reference can be found here and UWP SDK samples can be found here.
If you are interested in learning more about the new Windows 10 sensor features, please check out the Build 2015 session on sensors. If you find bugs or issues, please leverage the Windows Feedback tool or the MSDN forums. If you would like us to add any new features, please submit them over at UserVoice or provide them as comments below.