Job Summary
We are looking for a highly skilled Senior Data Engineer to design, build, and maintain a robust and scalable data infrastructure. In this role, you will play a critical part in shaping our data ecosystem to support analytics, reporting, and data-driven growth. You will collaborate closely with cross-functional teams, ensuring high-quality, secure, and reliable data pipelines that power strategic decision-making.
Key Responsibilities
- Design and develop data pipelines to integrate, process, and deliver high-quality datasets.
- Implement data orchestration using Apache Airflow and real-time streaming with Kafka.
- Build and maintain scalable, efficient, and secure data workflows.
- Design and optimize data architectures and data warehouses (e.g., PostgreSQL, ClickHouse).
- Define and maintain data models to improve performance of storage and retrieval.
- Ensure data quality, consistency, and freshness through monitoring, validation, and automated checks.
- Document data pipelines, structures, and processes for governance and reusability.
- Collaborate with product, analytics, and technical teams to understand business needs and deliver actionable data solutions.
- Provide structured and reliable datasets for reporting, dashboarding, and decision-making projects.
Key Requirements
- Experience: 2+ years in Data Engineering or a similar technical role.
- Programming: Proficiency in Python for data processing and automation.
- Databases: Expertise in PostgreSQL and strong knowledge of other RDBMS (e.g., Microsoft SQL Server); familiarity with ClickHouse and NoSQL solutions (e.g., MongoDB, Elasticsearch).
- ETL/ELT: Hands-on experience with Apache Airflow (pipeline orchestration) and SQL-based ETL processes.
- Streaming: Familiarity with Apache Kafka or similar streaming technologies.
- Data Modeling: Solid understanding of data architecture principles (star/snowflake schemas, dimensional modeling).
- DevOps Skills: Familiarity with Docker and CI/CD pipelines for modern data infrastructure deployment.
- BI Tools: Experience with reporting and visualization platforms (e.g., Power BI, Metabase, Kibana).
- Strong troubleshooting, debugging, and problem-solving abilities.
- Excellent communication and collaboration skills to work effectively across teams.
Nice to Have
- Experience with big data frameworks (Apache Spark, Apache Flink).
- Familiarity with version control tools (e.g., Git).
- Knowledge of DAX for Power BI measures.
- Practical experience in monitoring tools and data governance frameworks.
What We Offer
- Competitive salary
- Insurance and all legal benefits
- Pleasant, calm working environment
- Collaboration with a professional team in the banking solutions field
- Performance-based bonuses
- Meal and loan subsidies