
At Cargill, we care about your safety and want your job search to be a positive experience. Unfortunately, there are scammers who pose as Cargill recruiters to try to collect personal information or request payment. Please note that Cargill will never ask for payment during the recruitment process, and in most cases we accept applications only through our official careers site, except for certain positions at our production plants. If any part of the process seems off, or if you have questions, don't hesitate to contact us. To learn more, visit our Notice on Fraudulent Job Offers.

Associate Data Engineer

Apply Now
Job ID: 323610 | Date posted: 04/14/2026 | Location: Bengaluru, India | Category: Digital Data & Technology | Job Status: Salaried Full Time

Job Purpose and Impact

The Associate Data Engineer assists with the design, building, and maintenance of routine data systems that enable data analysis and reporting. Under close supervision, this role collaborates with the team to ensure that large data sets are efficiently processed and made accessible for decision making.

Key Accountabilities

  • DATA & ANALYTICAL SOLUTIONS: Assists with the development of basic data products and solutions using big data and cloud-based technologies, supporting scalable, sustainable and robust designs.
  • DATA PIPELINES: Collaborates on the development of basic streaming and batch data pipelines that ingest data from various sources, transform it into usable information, and move it to data stores such as data lakes and data warehouses.
  • DATA SYSTEMS: Assists with the implementation of existing data systems and architectures in support of improvement and optimization activities.
  • DATA INFRASTRUCTURE: Supports the preparation of data infrastructure aligned with the efficient storage and retrieval of data.
  • DATA FORMATS: Helps implement appropriate data formats to improve data usability and accessibility across the organization.
  • STAKEHOLDER MANAGEMENT: Gathers requirements from cross-functional partners, assisting the team in ensuring that data solutions meet partners' functional and non-functional needs.
  • DATA FRAMEWORKS: Conducts basic testing of new concepts and assists with the implementation of data engineering frameworks and architectures to support the improvement of data processing capabilities and analytics initiatives.
  • AUTOMATED DEPLOYMENT PIPELINES: Collaborates on the implementation of automated deployment pipelines to improve the efficiency of code deployments with fit-for-purpose governance.
  • DATA MODELING: Performs basic data modeling aligned with the datastore technology to ensure sustainable performance and accessibility.
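To illustrate the pipeline duties above, here is a minimal sketch of an ingest-transform-load flow in plain Python. This is an illustration only, not Cargill's stack: the field names, the in-memory CSV source, and the newline-delimited JSON sink are all invented stand-ins for a real source system and data-lake write.

```python
import csv
import io
import json

def ingest(raw_csv: str) -> list[dict]:
    """Ingest: parse raw CSV rows coming from a source system."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast types and drop malformed records."""
    clean = []
    for row in rows:
        try:
            clean.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return clean

def load(rows: list[dict]) -> str:
    """Load: serialize to newline-delimited JSON, standing in for a lake write."""
    return "\n".join(json.dumps(r) for r in rows)

raw = "id,amount\n1,10.5\n2,not-a-number\n3,7.0\n"
ndjson = load(transform(ingest(raw)))
print(ndjson)  # the malformed row 2 is filtered out
```

The same three-stage shape scales up directly: swap the CSV string for a Kafka topic or object-store path, and the JSON sink for a Parquet write, and the transform step stays the testable core.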

Qualifications

  • Bachelor's degree with two or more years of relevant experience.
  • CLOUD ENVIRONMENTS: Basic familiarity with major cloud platforms (AWS, GCP, Azure) and interest in learning how cloud services support data pipelines and storage.
  • DATA ARCHITECTURE: Introductory understanding of modern data architectures such as data lakes and lakehouses, with exposure to concepts like ingestion, governance, and basic data modeling.
  • DATA INGESTION: Hands-on experience or coursework using data ingestion tools (e.g., Kafka, AWS Glue) and awareness of common data storage formats like Parquet or Iceberg.
  • DATA STREAMING: Foundational understanding of streaming concepts and exposure to tools such as Kafka or Flink.
  • DATA MODELING: Experience writing SQL and supporting data transformation tasks. Familiarity with modeling concepts (e.g., SCDs, schema evolution) and introductory experience with tools like dbt, Airflow, or AWS Glue.
  • DATA TRANSFORMATION: Basic experience using Spark or similar frameworks for data processing, with a willingness to learn more advanced topics like performance tuning and debugging.
  • PROGRAMMING: Proficiency in at least one programming language (typically Python) and ability to write clean, reusable code. Comfortable with SQL basics and working toward stronger query optimization skills.
  • DEVOPS: General awareness of DevOps practices such as version control (Git) and basic CI/CD concepts. Interest in learning deployment and automation workflows.
  • DATA GOVERNANCE: Foundational understanding of data quality, security, and privacy principles. Awareness of best practices for handling data responsibly.
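The modeling concepts named above (SCDs, schema-aware upserts) can be made concrete with a small example. The sketch below implements a Type 2 slowly changing dimension in SQLite: when a tracked attribute changes, the current row is closed out and a new versioned row is inserted. The table, column names, and sample data are hypothetical, chosen only to show the pattern.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    valid_from  TEXT,
    valid_to    TEXT,      -- NULL while the row is current
    is_current  INTEGER
)""")

def scd2_upsert(cur, customer_id, city, as_of):
    """Close the current version if the attribute changed, then insert a new one."""
    cur.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,))
    current = cur.fetchone()
    if current is not None and current[0] == city:
        return  # no change: keep the existing current row
    if current is not None:
        cur.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (as_of, customer_id))
    cur.execute(
        "INSERT INTO dim_customer VALUES (?,?,?,NULL,1)",
        (customer_id, city, as_of))

scd2_upsert(cur, 1, "Bengaluru", "2026-01-01")
scd2_upsert(cur, 1, "Mumbai", "2026-03-01")  # city changed: new version
versions = cur.execute(
    "SELECT city, is_current FROM dim_customer "
    "WHERE customer_id=1 ORDER BY valid_from").fetchall()
print(versions)
```

Tools such as dbt snapshots automate exactly this bookkeeping; writing it by hand once makes the generated SQL much easier to reason about.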
