- Contract type: Permanent
- Location: Anywhere
Role Responsibilities
- Design, implement, and manage scalable data pipelines and ETL processes.
- Collaborate with data analysts and software engineers to optimize data workflows.
- Develop and maintain data models supporting analytics and reporting needs.
- Ensure data integrity, quality, and security across various systems.
- Troubleshoot and resolve data-related issues efficiently.
Skills
- Proficiency in SQL, Python, and cloud-based data platforms (e.g., AWS, Azure, or GCP).
- Experience with data pipeline tools such as Airflow, dbt, or Kafka.
- Strong problem-solving skills and a collaborative mindset.
- Ability to work effectively in a hybrid setup (office & remote).
Offer
- A hybrid work model offering flexibility.
- A collaborative and forward-thinking work culture.
- Opportunities for personal and professional growth.
- Competitive, negotiable salary package.
