Posted 3 hours ago
Job Domain: IT Services and IT Consulting
Top 3 reasons to join us
- Attractive salary and benefits;
- Be a developer, not a coder;
- Excellent environment and team to help you grow.
Job description
Duration: Full-time for 6 months
The Role
- We are looking for a pragmatic and self-driven contract Data Engineer to help accelerate our data roadmap. Your main focus will be to expand our data ecosystem by integrating new data sources and developing robust data models, directly supporting our self-service analytics goals.
- You will take ownership of our Airflow DAGs and dbt models, balancing the development of new pipelines with the maintenance of critical existing infrastructure.
- This is a hands-on role for someone who thrives on delivering business impact. While you’ll collaborate with senior engineers on high-level architecture for new data sources, you’ll have full ownership of implementation, operations, and the success of your projects.
Responsibilities
- Expand Data Coverage: Proactively own the end-to-end process of ingesting new data sources and building scalable pipelines using Airflow and dbt.
- Partner with Analysts: Collaborate closely with our data analysts to understand business requirements, define metrics, and develop the specific, reliable data models they need for their dashboards and analyses in Superset.
- Deliver Pragmatic Solutions: Consistently make pragmatic technical decisions that prioritize business value and speed of delivery, in line with our early-stage startup environment.
- Operational Excellence: Own the day-to-day health of the data platform. This includes monitoring pipelines, debugging failures, and helping to establish and maintain data quality SLAs.
- Cross-functional Collaboration: Work with other product and engineering teams to understand source systems and ensure seamless data extraction.
- Refactor and Improve: Identify opportunities to improve and refactor existing ingestion and transformation logic for better performance and maintainability.
Your skills and experience
- 5+ years of dedicated experience as a Data Engineer.
- Expertise in Apache Airflow: Proven experience developing, deploying, and owning complex data pipelines. You can work independently to build and debug intricate DAGs.
- Deep Expertise in dbt: Strong proficiency in building modular, testable, and maintainable data models with dbt.
- Pragmatic Problem-Solving: A demonstrated ability to choose the right solution for the problem at hand, avoiding over-engineering while ensuring robustness.
- Business Acumen: Experience translating ambiguous business or product requirements into concrete technical data solutions. You are comfortable asking “why” to understand the core business driver.
- Expert-level SQL and strong Python: Essential for all aspects of the role.
- Data Warehousing Fundamentals: Solid understanding of dimensional modeling and ETL/ELT best practices.
Nice-to-Haves
- AWS Experience: Familiarity with core AWS services used in a data context (Aurora RDS, S3, IAM).
- Experience in a Startup Environment: Comfortable with ambiguity and a fast-paced setting.
- BI Tool Support: Experience working closely with users of BI tools like Superset, Metabase, or Tableau.
Why you'll love working here
- Competitive salary;
- Mon-Fri, flexible remote working;
- English-speaking, professional working environment.