Presented by Tom Silverstrim, Sr. Manager of Analytics and Monitoring, Adobe.

Adobe's Digital Experience group has created the Experience Platform, the foundation of the Experience Cloud products, as an open system that transforms all your data (Adobe and non-Adobe) into robust customer profiles. Through the development of Experience Platform, data pipeline engineering has emerged as a key organizational capability, in line with developments across the delivery layers for software services. Consider:

- System layer: the transition to cloud, containers, and Kubernetes environments
- Platform layer: the rise of distributed service architectures and API ecosystems
- Application layer: the expectation of predictable and accurate machine learning systems

A consequence of these changes is the need to monitor and alert on the quality of data processing and movement across cloud and on-premise environments, with compute and storage services working in both streaming and batch modes. The expectation is that complex data processing workloads are handled by pipelines that are fault tolerant, repeatable, and highly available. Customers demand that we ensure resource availability, manage inter-task dependencies, retry transient failures or timeouts in individual tasks, and surface failures through a notification system. They also expect that solutions involved in complex pipelines will expose interfaces allowing integration into the broader pipeline. This article discusses how to use AEP's webhook interfaces to automate dataset monitoring.
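To make the webhook approach concrete, here is a minimal sketch of a receiver that AEP event notifications could be pointed at. It assumes the Adobe I/O Events registration handshake, in which the service sends a GET request carrying a `challenge` query parameter that the endpoint must echo back; the payload field names used in `summarize_event` (`event`, `dataSetId`, `status`) are illustrative placeholders, not the actual AEP event schema.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs


def extract_challenge(path: str):
    """Return the 'challenge' query parameter from a request path, if present.

    During webhook registration, Adobe I/O Events sends a GET request with
    ?challenge=... and expects the same value echoed in the response body.
    """
    values = parse_qs(urlparse(path).query).get("challenge")
    return values[0] if values else None


def summarize_event(payload: dict) -> str:
    """Build a one-line summary of an event payload for logging/alerting.

    NOTE: the keys below are hypothetical -- consult the actual AEP event
    schema for the real field names.
    """
    event = payload.get("event", {})
    return f"dataset={event.get('dataSetId', '?')} status={event.get('status', '?')}"


class WebhookHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Registration handshake: echo the challenge back as plain text.
        challenge = extract_challenge(self.path) or ""
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(challenge.encode())

    def do_POST(self):
        # Event delivery: parse the JSON body and hand it to monitoring.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        print(summarize_event(payload))  # replace with your alerting hook
        self.send_response(200)
        self.end_headers()


# To run the receiver locally (port is an arbitrary choice):
# HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()
```

In a real deployment the `do_POST` handler would feed a notification system (pager, chat, ticketing) rather than printing, and the endpoint would sit behind TLS with request authentication.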