Key Responsibilities
Product Disposition Solutions: Develop, validate, and deploy product-disposition solutions (wormhole) to ensure quality and reduce Defects Per Million (DPM) through a comprehensive product-disposition program/workflow.
Data Analysis for Yield Improvement and Reliability: Utilize
Why choose iZeno? iZeno was founded in 2003 to provide enterprises with custom-built technology solutions they need to keep their business running seamlessly. Logicalis Asia, part of the Logicalis Group, a leading international IT solutions and managed services provider, has acquired a majority stak
Job Description
Responsibilities
- Design, build, and deploy data solutions as per business needs
- Design and secure data storage and transmission
- Support data-related queries and discrepancies
- Build scalable ETL pipelines and processes
- Collaborate with cross-functional teams
Requirements
- Relevant work
About the Hiring Team Level Infinite is Tencent’s global gaming brand. It is a global game publisher offering a comprehensive network of services for games, development teams, and studios around the world. We are dedicated to delivering engaging and original gaming experiences to a worldwide audience
Job Description
1. Perform detailed cooling load calculations and thermal analysis for data halls, including rack-level heat load assessments.
2. Design, review, and optimize ACMV systems, including:
   - CRAC/CRAH units
   - Chilled water systems
   - Cooling towers and pumps
3. Lead and sup
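The rack-level heat load assessment in point 1 can be illustrated with a back-of-envelope calculation: nearly all IT electrical load becomes heat, so cooling capacity is sized from rack power draw. This is a sketch; the per-rack figures and the 1.2 safety factor are assumptions, not values from the posting.

```python
TON_KW = 3.517  # one refrigeration ton expressed in kW

def cooling_load_tons(racks: int, kw_per_rack: float, safety_factor: float = 1.2) -> float:
    """Total cooling load in refrigeration tons, with a design safety margin.

    The safety factor is an assumed design margin, not a posting requirement.
    """
    heat_kw = racks * kw_per_rack  # IT heat load in kW (electrical ~= thermal)
    return heat_kw * safety_factor / TON_KW

# e.g. a hypothetical 100-rack hall at 8 kW/rack with a 20% margin
print(round(cooling_load_tons(100, 8.0), 1))
```

The same function works per-rack by setting `racks=1`, which is how a rack-level assessment would roll up into the hall total.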
Job Requirements:
· Degree in Information Technology or equivalent.
· Must have a minimum of 2-4 years of experience as a Data Analyst
· Must have hands-on experience in SQL, Athena, and BigQuery
· Must have experience in A/B testing methodology and statistical analysis
· Experience in any Data Visualization
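The A/B testing and statistical analysis requirement above typically comes down to comparing conversion rates between two variants. A minimal sketch using a pooled two-proportion z-test; the sample counts are made up for illustration.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates (pooled variance)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# hypothetical experiment: 200/4000 conversions in control, 260/4000 in variant
z, p = two_proportion_ztest(200, 4000, 260, 4000)
print(round(z, 2), round(p, 4))
```

In practice the same test is often run with `scipy.stats` or directly in SQL with window aggregates, but the arithmetic is what is shown here.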
We are looking for a skilled and experienced AI Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines to support the processing and analysis of clinical study and digital device sensor data. As an AI Data Engineer, you
Our client, a fast-growing digital bank, is seeking a Senior Data Engineer to join their data engineering team. The role will focus on building automation frameworks for data source onboarding and integration, reducing manual effort and enabling the team to scale with the business. The position will
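An automation framework for data source onboarding, as described above, usually centres on a config registry: adding a source becomes a data change rather than new pipeline code. A hypothetical sketch; `SOURCES`, `onboard`, and every entry name are illustrative, not from the posting.

```python
# Config-driven source registry: downstream ingestion jobs iterate over this
# dict instead of having one hand-written pipeline per source.
SOURCES = {
    "payments_api": {"format": "json", "schedule": "hourly"},
    "core_banking": {"format": "csv", "schedule": "daily"},
}

def onboard(name: str, fmt: str, schedule: str) -> dict:
    """Register a new data source; rejects duplicates to keep the registry clean."""
    if name in SOURCES:
        raise ValueError(f"source {name!r} already onboarded")
    SOURCES[name] = {"format": fmt, "schedule": schedule}
    return SOURCES[name]

entry = onboard("card_txns", "parquet", "hourly")
print(entry["format"], len(SOURCES))
```

The scaling benefit is that the team reviews one small config entry per new source instead of a bespoke integration.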
About the Team: You will be part of Sea’s HQ Finance department, which plays an integral role in supporting the accounting and finance needs of our three core businesses (Garena, Shopee, Monee) and across our diverse regional market presence. Within the department, the Group Data function builds and
- Design, develop, and maintain scalable ETL/ELT pipelines using Databricks.
- Build and optimize data workflows using Apache Spark (PySpark/Scala).
- Implement data ingestion from multiple sources (APIs, databases, streaming platforms).
- Develop and manage data lakes and lakehouse architectures.
- Work with
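The ETL/ELT pipeline work listed above can be sketched in miniature. Plain Python stands in here for the PySpark/Databricks stack the role actually uses; the stage names and sample records are illustrative.

```python
def extract(rows):
    """Pull raw records from a source (an in-memory list stands in for an API or table)."""
    return list(rows)

def transform(rows):
    """Clean and reshape: drop incomplete rows, normalise amounts to integer cents."""
    return [
        {"id": r["id"], "amount_cents": int(round(r["amount"] * 100))}
        for r in rows
        if r.get("id") is not None and r.get("amount") is not None
    ]

def load(rows, sink):
    """Append transformed rows to a sink (a list stands in for a warehouse table)."""
    sink.extend(rows)
    return len(rows)

raw = [{"id": 1, "amount": 9.99}, {"id": None, "amount": 5.0}, {"id": 2, "amount": 0.5}]
table = []
loaded = load(transform(extract(raw)), table)
print(loaded, table)
```

In Spark the `transform` step would be DataFrame operations (`filter`, `withColumn`) so it scales beyond memory, but the extract-transform-load shape is the same.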
Job Title: Python / Data Engineer – Quantitative Investment Support
Location: Onsite – Singapore
Employment Type: Contract
About the Role
We are looking for a highly skilled Python/Data Engineer with strong experience in data manipulation, statistics, and machine learning models to support our Quant
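Day-to-day data manipulation and statistics in a quant-support role often start from return series and standardisation. A minimal standard-library sketch; the price series is made up, and in practice this would be done with pandas/NumPy.

```python
import statistics

def simple_returns(prices):
    """Period-over-period simple returns from a price series."""
    return [(b - a) / a for a, b in zip(prices, prices[1:])]

def zscore(x, sample):
    """How many sample standard deviations x sits from the sample mean."""
    return (x - statistics.mean(sample)) / statistics.stdev(sample)

prices = [100.0, 101.0, 99.0, 102.0, 103.0]
rets = simple_returns(prices)
print([round(r, 4) for r in rets])
print(round(zscore(rets[-1], rets), 2))
```

Standardised returns like these feed directly into the statistical tests and machine-learning features the role mentions.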
- Experience and passion for data engineering in a big data environment using Cloud platforms - AWS, GCP or Azure
- Experience with building production-grade data pipelines, ETL/ELT data integration
- Interested in being the bridge between engineering and analytics
- Knowledgeable about system d
About Us Join us at Orange Business! We are a network and digital integrator that understands the entire value chain of the digital world, freeing our customers to focus on the strategic initiatives that shape their business. Every day, you will collaborate with a team dedicated to providing consist