Ad code: KP4436662190

Senior Data Engineer

Mofid Brokerage - Mofid Portfolio Management - Mofid Idea Investment Advisory - Mofid Financial Information Processing
Tehran - Jordan district
Posted on the Jobvision website (3 weeks ago)
Job details:
Employment type: Full-time
Required skills:
Python
Working hours: Saturday to Wednesday, 07:30 to 16:30
Full job description:

We are seeking a Data Engineer with strong programming foundations—especially in Python—to design, build, and operate reliable data pipelines and analytics infrastructure. You will work closely with analytics teams to transform raw data into trusted, high‑quality datasets and performance‑optimized tables in ClickHouse and PostgreSQL, orchestrated via Apache Airflow.

What You’ll Do

  • Design, build, and maintain scalable ETL/ELT pipelines in Python, orchestrated with Airflow (DAGs, operators, sensors, schedules, SLAs, backfills).
  • Contribute to platform improvements (frameworks, libraries, and best practices) to accelerate data delivery.
  • Model, load, and optimize datasets in ClickHouse and PostgreSQL (partitioning, indexing, materialized views where appropriate, query performance tuning).
  • Implement data quality checks, validation, and monitoring (e.g., unit/integration tests for pipelines, schema/contracts, alerts).
  • Own pipeline reliability and operability: logging, observability, alerting, failure recovery, and cost/performance tuning.
  • Manage and improve CI/CD for data workflows, including code reviews, automated testing, and deployment.
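The data-quality and validation duties above are described only at a high level; as a hedged illustration (all names and the row schema here are hypothetical, not part of the posting), a minimal validation gate for an ETL step might look like this in Python:

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Trade:
    symbol: str
    price: float
    quantity: int

def validate(rows: Iterable[dict]) -> list[Trade]:
    """Basic data-quality gate: drop rows that violate the schema/contract."""
    clean: list[Trade] = []
    for row in rows:
        if not row.get("symbol"):
            continue  # reject rows with a missing or empty symbol
        price = float(row["price"])
        qty = int(row["quantity"])
        if price <= 0 or qty <= 0:
            continue  # reject non-positive prices or quantities
        clean.append(Trade(row["symbol"], price, qty))
    return clean

raw = [
    {"symbol": "FOOLAD", "price": 512.5, "quantity": 100},
    {"symbol": "", "price": 10.0, "quantity": 5},         # fails: empty symbol
    {"symbol": "KHODRO", "price": -3.0, "quantity": 20},  # fails: negative price
]
print(len(validate(raw)))  # → 1 (only the first row passes)
```

In a real pipeline, checks like these would typically run as an Airflow task (or via a tool such as Great Expectations) with alerting on rejection rates, rather than silently dropping rows.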

Required Qualifications:

  • 3+ years of experience as a Data Engineer (or similar role) building production‑grade data pipelines.
  • Strong Python expertise: clean code, packaging, type hints, testing (pytest), and performance optimization; familiarity with Django or FastAPI is a plus.
  • Hands‑on experience with Apache Airflow: authoring TaskFlow‑style DAGs, custom operators/sensors, scheduling, retries, backfills, and SLAs; operational familiarity with UI monitoring, logs, pools/queues, and variable/connection management.
  • Proficiency with ClickHouse: table engines, partitioning, primary keys, compression; query optimization, materialized views, distributed tables, and ingestion strategies.
  • Strong SQL skills and PostgreSQL experience: schema design, normalization/denormalization, indexing, query analysis, and performance tuning.
  • Experience with version control (Git) and CI/CD workflows.
  • Comfort with Linux, shell scripting, and basic networking concepts.
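To make the "clean code, type hints, testing (pytest)" expectation concrete, here is a hedged sketch (the function and test are illustrative, not from the posting) of a small typed pipeline helper with the kind of test pytest would collect automatically:

```python
from typing import Sequence

def moving_average(values: Sequence[float], window: int) -> list[float]:
    """Simple moving average: a typical small, typed, testable transform."""
    if window <= 0:
        raise ValueError("window must be positive")
    return [
        sum(values[i : i + window]) / window
        for i in range(len(values) - window + 1)
    ]

# pytest discovers functions named test_* automatically; bare asserts suffice.
def test_moving_average() -> None:
    assert moving_average([1.0, 2.0, 3.0, 4.0], 2) == [1.5, 2.5, 3.5]

test_moving_average()
print("ok")
```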

Additional Qualifications:

  • Experience with streaming ecosystems (Kafka), CDC, or near‑real‑time pipelines.
  • Experience with Apache Spark for distributed data processing.
  • Familiarity with dbt, Great Expectations, or similar tooling for transformations and data quality.
  • Containerization with Docker; Kubernetes is a plus.
  • Experience with Jenkins for CI/CD pipeline orchestration.
  • Monitoring/observability stacks (Prometheus/Grafana, ELK/OpenSearch) applied to data pipelines.
