Malaysia, Remote Posted 3/30/2026 datadotlabs.com
Datadotlabs is seeking a passionate and skilled Data Engineer to join its advanced analytics team. This fully remote role focuses on building scalable data solutions that power AI and machine learning initiatives. You will drive business growth by designing and implementing robust data architectures, ensuring a solid, scalable foundation for advanced analytics. The ideal candidate has hands-on experience with modern cloud data warehouses such as Snowflake and with major cloud providers such as AWS, Azure, or GCP. You will manage complex data lifecycles, from ingestion to transformation, ensuring high-quality data is available for analytics. The role offers the opportunity to work with technologies such as PySpark, Airflow, and Kafka in a collaborative Agile environment, where your contributions will directly shape the company's data-driven decision-making.
Key Requirements
At least 2 years of professional experience with Snowflake data warehousing.
Proven hands-on experience with cloud platforms such as AWS, Azure, or GCP.
Strong programming proficiency in Python for data manipulation.
Expertise in writing, optimizing, and debugging complex SQL queries.
Significant experience in designing and maintaining robust ETL pipelines.
Practical knowledge of Big Data tools including PySpark for distributed processing.
Familiarity with workflow orchestration tools like Airflow for task scheduling.
Experience working with real-time streaming platforms such as Kafka.
Demonstrated ability to manage both structured and semi-structured data sets.
Solid understanding of DevOps practices and Agile development methodologies.