News

A leader who is making progress on these issues is Ujjawal Nayak, whose achievements in cloud and data lake projects led to ...
Who needs rewrites? This metadata-powered architecture fuses AI and ETL so smoothly that it turns pipelines into self-evolving ...
Apache Airflow is a great tool for defining data pipelines as code, but the fact that most of its contributors work for Astronomer is another example of a problem with open source.
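For context, "pipelines as code" in Airflow means a pipeline is an ordinary Python module the scheduler picks up from the DAGs folder. The following is a minimal sketch assuming Airflow 2.4 or later; the dag_id, schedule, and task callables are illustrative placeholders, not taken from the article above.

```python
# Minimal Airflow DAG sketch: the pipeline itself is ordinary Python code.
# dag_id, schedule, and the task functions below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pretend to pull raw records from a source system.
    return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]


def transform(ti):
    # Read the upstream task's return value from XCom and reshape it.
    records = ti.xcom_pull(task_ids="extract")
    return [{"id": r["id"], "value": r["value"] * 2} for r in records]


def load(ti):
    # A real pipeline would write to a warehouse or data lake here.
    print(ti.xcom_pull(task_ids="transform"))


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Dependencies are expressed directly in code.
    extract_task >> transform_task >> load_task
```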
Essentially, AWS Data Pipeline is a way to automate the movement and transformation of data, making workflows reliable and consistent even as the underlying infrastructure or data repositories change.
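That automation is driven through the service's API. The sketch below, using boto3's datapipeline client, is a hedged illustration only: the pipeline name, uniqueId, region, and the single Default object are assumptions, and a real definition would also need data nodes, activities, IAM roles, and a log location before activation would succeed.

```python
# Hedged sketch of registering and activating an AWS Data Pipeline with boto3.
# All names, IDs, and fields are illustrative placeholders.
import boto3

client = boto3.client("datapipeline", region_name="us-east-1")

# Register an empty pipeline shell; uniqueId guards against duplicate creation.
pipeline = client.create_pipeline(name="example-copy", uniqueId="example-copy-001")
pipeline_id = pipeline["pipelineId"]

# A pipeline definition is a list of objects expressed as key/value fields.
# A usable definition would include data nodes, activities, roles, and logging.
definition = [
    {
        "id": "Default",
        "name": "Default",
        "fields": [
            {"key": "scheduleType", "stringValue": "ondemand"},
        ],
    },
]

result = client.put_pipeline_definition(
    pipelineId=pipeline_id, pipelineObjects=definition
)

# Only activate if the service reported no validation errors.
if not result.get("errored"):
    client.activate_pipeline(pipelineId=pipeline_id)
```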
As an open-source tool, Apache Airflow plays an instrumental role in making data orchestration practical by removing common pain points.
The infrastructure behind AI agents isn't static—it’s a living, evolving system. Designing effective data pipelines means ...
Apache Airflow instances that have not been properly secured are exposing credentials for everything from Slack to AWS online.
Database startup InfluxData Inc. is adding new functions to its platform today that enable users to collect, process and store time-series data in the InfluxDB Cloud more rapidly. InfluxDB Native ...
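As a rough illustration of that collect-and-store workflow, here is a hedged sketch using the influxdb_client Python library; the URL, token, org, bucket, and measurement names are placeholders, not details from the announcement.

```python
# Hedged sketch: write one time-series point to InfluxDB Cloud, then query it back.
# URL, token, org, bucket, and measurement are placeholders.
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

client = InfluxDBClient(
    url="https://us-east-1-1.aws.cloud2.influxdata.com",  # placeholder region URL
    token="MY_TOKEN",
    org="my-org",
)

# Write a single measurement, tagged by host, with one field value.
write_api = client.write_api(write_options=SYNCHRONOUS)
point = Point("cpu").tag("host", "server01").field("usage_percent", 42.5)
write_api.write(bucket="my-bucket", record=point)

# Query the last hour of that measurement back with Flux.
query_api = client.query_api()
tables = query_api.query(
    'from(bucket: "my-bucket") |> range(start: -1h) '
    '|> filter(fn: (r) => r._measurement == "cpu")'
)
for table in tables:
    for record in table.records:
        print(record.get_time(), record.get_value())

client.close()
```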
Apache Airflow is one of the world’s most popular open source tools for building and managing data pipelines, with around 16 million downloads per month. Those users will see several compelling new ...
SEATTLE, Nov. 25, 2020 — Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), announced the general availability of Amazon Managed Workflows for Apache Airflow (MWAA), a new managed ...