By adding Capture to Event Hubs, cloud customers get a built-in option for data retention and downstream micro-batch processing, letting them pull data directly from Event Hubs into Azure Data Lake Store. With Capture, streaming data in Event Hubs is automatically delivered to Azure Blob storage or Azure Data Lake Store, and customers can choose the time or size interval at which the data is written. Microsoft says Event Hubs is the simplest way to load streaming data into Azure. In its announcement today, the company described how the integration works: “Capture will manage all the compute and downstream processing required to do this. Create your Azure Data Lake Store and set up appropriate permissions for your event hub that has Capture enabled, and you will see how easy it is to stream data into Azure.”

Integration

The company says the Data Lake Store destination can be set up in the Azure Portal through the Capture provider, and it can also be enabled through Azure Resource Manager templates. Once Capture is enabled, users select Azure Data Lake Store as the provider and choose the time and size windows; when the destination is selected, events are captured to it automatically. In its announcement, the company says Capture is easy to set up, is affordable for customers, and adds no configuration overhead for real-time or batch analytics: “Unleash the power of Azure Data Lake Store for your big data requirements at real-time or batch processing and visualization. With Event Hubs Capture streaming data, you can now optimize your data analysis and visualization.”
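As a rough illustration of the Resource Manager route, the sketch below builds the kind of capture configuration an ARM template might carry for an event hub, with Azure Data Lake Store as the destination and the time and size windows described above. The specific property names (captureDescription, intervalInSeconds, sizeLimitInBytes, the EventHubArchive.AzureDataLake destination and its dataLakeAccountName and dataLakeFolderPath fields), the API version, and all resource names are assumptions made for illustration and are not taken from Microsoft's announcement.

import json

# Hypothetical sketch of a Resource Manager-style event hub resource with
# Capture enabled and Azure Data Lake Store as the destination. Property
# names and values are assumptions for illustration, not an official schema.
event_hub_resource = {
    "type": "Microsoft.EventHub/namespaces/eventhubs",
    "name": "mynamespace/myeventhub",     # hypothetical namespace/hub names
    "apiVersion": "2017-04-01",           # assumed API version
    "properties": {
        "messageRetentionInDays": 1,
        "partitionCount": 4,
        "captureDescription": {
            "enabled": True,
            "encoding": "Avro",
            "intervalInSeconds": 300,      # time window: every 5 minutes
            "sizeLimitInBytes": 314572800, # size window: roughly 300 MB
            "destination": {
                "name": "EventHubArchive.AzureDataLake",
                "properties": {
                    "dataLakeSubscriptionId": "<subscription-id>",
                    "dataLakeAccountName": "myadlsaccount",   # hypothetical
                    "dataLakeFolderPath": "/eventhubs/capture",
                },
            },
        },
    },
}

# Print the fragment as it could be dropped into the "resources" section of
# an ARM template and deployed with the Azure CLI or PowerShell.
print(json.dumps(event_hub_resource, indent=2))

In this kind of setup the time and size windows act as thresholds: a capture file is written once either limit is reached, so smaller values generally mean more files with lower delivery latency.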
