Role and Responsibilities
- Develop and automate large-scale, high-performance, scalable data pipelines (batch and streaming) to enable faster analytics
- Design new data architectures with excellent run-time characteristics such as low latency, fault tolerance, and high availability
- Maintain and monitor real-time analytics and big data systems to ensure their reliability and resolve issues
- Optimize data processing
- Troubleshoot data-related issues
- Collaborate with cross-functional teams
- Develop high-quality database solutions
- Develop, implement, and optimize stored procedures and functions using T-SQL
- Review and interpret ongoing business report requirements
- Build appropriate and useful reporting deliverables
- Analyze existing SQL queries for performance improvements
Qualifications and Education Requirements
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Strong programming skills in languages such as Python, Java, and SQL.
- Familiarity with big data technologies such as Apache Hadoop, Apache Spark, and NoSQL databases.
- Experience with data modelling, data architecture, and database design.
- Knowledge of data warehousing concepts and methodologies.
- Excellent problem-solving and troubleshooting skills.
Skills
- Ability to perform business process analysis.
- Proven abilities to take initiative and be innovative.
- Analytical mind with a problem-solving attitude.
- A keen eye for detail and the ability to spot and fix errors in complex code.
- Ability to perform tasks independently.
- Strong ability to work as part of a team.