Expired
Top 3 reasons to join us
- Attractive Salary and Performance Bonus
- Health Care for Employees & Family, 18 paid leaves
- Chance to travel onsite (in 53 countries)
Job description
You will be responsible for building and running data processing pipelines on the Google Cloud Platform.
- Work with implementation teams from concept to operations, providing deep technical expertise for successfully deploying large-scale data solutions in the enterprise using modern data/analytics technologies on GCP
- Work with the data team to use GCP efficiently to analyze data, build data models, and generate reports/visualizations
- Integrate massive datasets from multiple data sources for data modelling
- Implement DevOps automation for all parts of the data pipeline build, covering deployment from development to production
- Formulate business problems as technical data problems while ensuring key business drivers are captured in collaboration with product management
- Design pipelines and architectures for data processing
- Extract, load, transform, clean and validate data
Your skills and experience
Must requirements:
- At least 2 years of experience with ETL tools (such as Informatica, Apache Beam, Kafka)
- At least 1 year of experience with Google Cloud Platform (BigQuery, Dataproc, Dataflow)
- Relevant experience in at least one of the following scripting/programming languages: Java, Python, or Go
- Relevant experience with declarative CI/CD (Jenkins, Azure DevOps)
- Relevant experience with databases: SQL and NoSQL
- A strong engineering mindset: automating tasks, identifying use cases and test cases, improving the system, and handling PR/incident resolution and deployments
Good to Have:
- Relevant experience with automation: Kubernetes or Docker & containerization
- Relevant experience with Infrastructure as Code (IaC) (e.g., Terraform, CloudFormation, Azure ARM templates)
- Knowledge of or experience with Big Data - the Hadoop ecosystem, including HDFS, MapReduce, YARN, HBase, ZooKeeper, Spark, Pig, Hive, etc.
- Knowledge of or experience with Hadoop distributions such as Cloudera, Hortonworks, etc.
- Relevant experience in data management:
- Data Governance
- Data Architecture
- Data Modelling
- Data Quality
- Data Integration
Why you'll love working here
- Salary up to $4,000 and joining bonus up to 30M VND
- Attractive package including base salary + 13th month salary + Performance Bonus
- Insurance based on full base salary
- Meal allowance of 730,000 VND/month
- 100% of full salary and benefits as an official employee from the first day of work
- Medical Benefit (Bao Viet Insurance Package) for Employee and Family
- Working in a fast-paced, flexible, and multinational environment with the opportunity to travel onsite (in 49 countries)
- Internal training (Technical, Functional & English)
- Working time: 8:30 am - 6:00 pm, Monday to Friday
HCL Vietnam Company Limited
We bring together the best of technology and our people to supercharge progress.
Company type
IT Outsourcing
Company size
501-1000 employees
Country
India
Working days
Monday - Friday
Overtime policy
No OT