Consultant, Data Analytics & Reporting - Ag & Trading
Job Purpose and Impact
The Professional designs, builds, and maintains moderately complex data systems that enable data analysis and reporting. With limited supervision, the role collaborates with partners to ensure that large sets of data are efficiently processed and made accessible for decision-making. This includes designing and implementing scalable data pipelines, optimizing workflows, and ensuring data quality and integrity to support enterprise applications and analytics. The role also involves working with a wide range of tools and technologies, including Snowflake, Python, PL/SQL, and Power BI, to deliver robust and efficient data solutions.
Key Accountabilities
- DATA & ANALYTICAL SOLUTIONS: Designs and implements scalable and robust data engineering solutions using advanced technologies and cloud platforms. Applies best practices to ensure sustainability and performance of data systems. Demonstrates strong problem-solving abilities and delivers practical solutions in fast-paced environments.
- DATA PIPELINES: Builds and maintains efficient ETL processes and data pipelines for batch and streaming data, enabling seamless ingestion and transformation of large datasets. Proficient in handling big data using tools such as Hadoop, HDFS, MapReduce, and Hive, as well as Snowflake tools such as Snowpark and SnowPy.
- DATA SYSTEMS: Reviews and optimizes backend systems and data architectures to improve performance and reliability of enterprise data solutions. Applies performance optimization techniques to enhance data workflows.
- DATA INFRASTRUCTURE: Prepares and manages infrastructure components to support efficient data storage, retrieval, and processing. Ensures data quality and integrity across systems.
- DATA FORMATS: Implements appropriate data formats and structures to enhance usability and accessibility across analytics and reporting platforms.
- STAKEHOLDER MANAGEMENT: Collaborates with cross-functional teams and business stakeholders to gather requirements and deliver data solutions aligned with business needs. Known for effective communication and teamwork.
- DATA FRAMEWORKS: Develops prototypes and implements data engineering frameworks to support analytics initiatives and improve data processing capabilities. Leads strategic initiatives to enhance data engineering practices.
- AUTOMATED DEPLOYMENT PIPELINES: Implements automated deployment pipelines to streamline code releases and ensure governance and compliance.
- DATA MODELING: Performs data modeling aligned with technologies such as Snowflake and PL/SQL to ensure performance, scalability, and accessibility. Experienced in data design, development, and documentation.
Qualifications
- Minimum of 6-8 years of relevant work experience.