This is an automated email from the ASF dual-hosted git repository.
potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow-site.git
The following commit(s) were added to refs/heads/main by this push:
new 2788980aff Add adjoe use case (#1072)
2788980aff is described below
commit 2788980affe6f9bcc6f08c792c6a218cd286e4d3
Author: Tadeh Alexani <[email protected]>
AuthorDate: Wed Oct 2 01:43:04 2024 +0200
Add adjoe use case (#1072)
* Add adjoe use case
* Update adjoe-logo.svg
---------
Co-authored-by: Tadeh Alexani <[email protected]>
---
landing-pages/site/content/en/use-cases/adjoe.md | 20 ++++++++++++++++++++
.../site/static/usecase-logos/adjoe-logo.svg | 1 +
2 files changed, 21 insertions(+)
diff --git a/landing-pages/site/content/en/use-cases/adjoe.md
b/landing-pages/site/content/en/use-cases/adjoe.md
new file mode 100644
index 0000000000..46e3b7dc9a
--- /dev/null
+++ b/landing-pages/site/content/en/use-cases/adjoe.md
@@ -0,0 +1,20 @@
+---
+title: "adjoe"
+linkTitle: "adjoe"
+quote:
+ text: "Deploying Airflow allowed us to efficiently manage workloads with
multiple DAGs, from generating reports and system analyses to training machine
learning models and preparing datasets."
+ author: "Tadeh Alexani"
+logo: "adjoe-logo.svg"
+blocktype: "testimonial"
+---
+
+##### What was the problem?
+Before adopting Airflow at adjoe, we handled job scheduling in two main ways:
by setting up Kubernetes CronJobs or building AWS Lambda functions. While both
approaches had their benefits, they also came with limitations, especially when
it came to managing more complex workloads. As our data science teams' needs
evolved, it became clear that we needed a more robust and flexible
orchestration tool.
+
+##### How did Apache Airflow help to solve this problem?
+With the creation of a new AWS environment for the data science teams, we
introduced Airflow on Kubernetes as our primary orchestration solution,
addressing both stability and scalability requirements.
+
+After deploying Airflow in our staging and production environments, we were
able to create multiple DAGs to manage and schedule a variety of workloads
efficiently. These range from generating and emailing daily reports to
performing system analyses, training complex machine learning models using the
Spark Operator or Kubeflow’s Training Operator for GPU models, and preparing
datasets using Airflow’s ETL capabilities.
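
The workload mix described above maps naturally onto small, per-purpose DAG files. A minimal sketch of one such daily-report pipeline, using Airflow's TaskFlow API, might look like the following. This is an illustration only, not adjoe's actual code: the DAG id, schedule, task names, and sample data are all invented.

```python
# Hypothetical sketch of a daily-report DAG like those described above.
# The DAG id, schedule, and task logic are invented for illustration.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="daily_report_example",   # hypothetical name
    schedule="0 6 * * *",            # run every day at 06:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["example"],
)
def daily_report():
    @task
    def extract() -> list[dict]:
        # In a real DAG this would query a warehouse or object store.
        return [{"metric": "installs", "value": 1200}]

    @task
    def build_report(rows: list[dict]) -> str:
        return "\n".join(f"{r['metric']}: {r['value']}" for r in rows)

    @task
    def send_report(report: str) -> None:
        # Placeholder for e.g. an EmailOperator or SES call.
        print(report)

    send_report(build_report(extract()))


daily_report()
```

A file like this dropped into the DAGs folder is picked up by the scheduler automatically, which is what lets data scientists deploy pipelines without touching infrastructure.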
+
+##### What are the results?
+By implementing Airflow, our data scientists can now manage and schedule their
jobs more efficiently. Monitoring job statuses has become simpler, thanks to an
intuitive interface that also provides easy access to logs. The need for
infrastructure management has been significantly reduced, allowing data
scientists to test and deploy their DAGs independently, which in turn has
accelerated development for both teams. Currently, our Data Science teams
manage over 20 DAGs and more than 50 tasks, [...]
diff --git a/landing-pages/site/static/usecase-logos/adjoe-logo.svg
b/landing-pages/site/static/usecase-logos/adjoe-logo.svg
new file mode 100644
index 0000000000..97e6cfbaff
--- /dev/null
+++ b/landing-pages/site/static/usecase-logos/adjoe-logo.svg
@@ -0,0 +1 @@
+<svg width='144' height='40' fill='none'
xmlns='http://www.w3.org/2000/svg'><path d='M64.15
10.38v20.587h-4.62V28.19c-1.083 1.852-3.788 3.2-6.494 3.2-5.495
0-9.782-4.716-9.782-10.736 0-5.979 4.288-10.694 9.782-10.694 2.706 0 5.411
1.348 6.494 3.2V10.38h4.62zm-4.62
13.472v-6.315c-1-1.81-3.206-3.115-5.537-3.115-3.496 0-6.16 2.652-6.16 6.23 0
3.58 2.664 6.273 6.16 6.273 2.29 0 4.537-1.305 5.537-3.073zM87.67
1.497v29.47h-4.62v-2.779c-1.083 1.853-3.789 3.2-6.494 3.2-5.495
0-9.783-4.715-9.783- [...]