Data Engineer III – Data Ventures

Walmart

Position Summary…

What you’ll do…

Do you have boundless energy and passion for engineering data used to solve dynamic problems that will shape the future of retail? With the sheer scale of Walmart’s environment comes the biggest of big data sets. As a Walmart Data Engineer, you will dig into our mammoth scale of data to help unleash the power of retail data science by imagining, developing, and maintaining data pipelines that our Data Scientists and Analysts can rely on. You will be responsible for contributing to an orchestration layer of complex data transformations, refining raw data from source into targeted, valuable data assets for consumption in a governed way. You will partner with Data Scientists, Analysts, other engineers, and business stakeholders to solve complex and exciting challenges so that we can build out capabilities that evolve the retail business model while making a positive impact on our customers’ lives.

MUST HAVE: Experience with Hadoop, Spark, cloud platforms, Python/PySpark, Java, streaming, Kafka, and backend development is required to be considered.

About Team: Data Ventures

Our team creates reusable technologies to help with customer acquisition, onboarding, and empowering merchants, while ensuring a seamless experience for both stakeholders. We also optimize tariffs and assortment in accordance with Walmart’s Everyday Low Cost philosophy. We not only create affordability, but we also deliver customized experiences for customers across all channels – in-store, mobile app, and websites. Our team is responsible for providing support to US Marketplace sellers. We focus on providing immediate solutions to the cases/tickets created by sellers. We interact with multiple teams across the company to provide an excellent seller experience.

What you’ll do:

  • Data Transformation and Integration: Extracts data from identified databases. Creates data pipelines and transforms data to a structure that is relevant to the problem by selecting appropriate techniques. Develops knowledge of current analytics trends.
  • Data Source Identification: Supports the understanding of the priority order of requirements and service level agreements. Helps identify the most suitable source for data that is fit for purpose. Performs initial data quality checks on extracted data.
  • Data Modeling: Analyzes complex data elements, systems, data flows, dependencies, and relationships to contribute to conceptual, physical, and logical data models. Develops the Logical Data Model and Physical Data Models including data warehouse and data mart designs. Defines relational tables, primary and foreign keys, and stored procedures to create a data model structure. Evaluates existing data models and physical databases for variances and discrepancies. Develops efficient data flows. Analyzes data-related system integration challenges and proposes appropriate solutions.
  • Code Development and Testing: Writes code to develop the required solution and application features by determining the appropriate programming language and leveraging business, technical and data requirements. Creates test cases to review and validate the proposed solution design. Creates proofs of concept. Tests the code using the appropriate testing approach. Deploys software to production servers. Contributes code documentation, maintains playbook, and provides timely progress updates.
  • Applied Business Acumen: Provides recommendations to business stakeholders to solve complex business issues. Develops business cases for projects with a projected return on investment or cost savings. Translates business requirements into projects, activities, and tasks and aligns to overall business strategy. Serves as an interpreter and conduit to connect business needs with tangible solutions and results. Recommends new processes and ways of working.
  • Data Governance: Establishes, modifies, and documents data governance projects and recommendations. Implements data governance practices in partnership with business stakeholders and peers. Interprets company and regulatory policies on data. Educates others on data governance processes, practices, policies, and guidelines. Provides recommendations on needed updates or inputs into data governance policies, practices, or guidelines.
  • Demonstrates up-to-date expertise and applies it to the development, execution, and improvement of action plans by providing expert advice and guidance to others, supporting and aligning efforts to meet customer and business needs, and building commitment for perspectives and rationales.
  • Provides and supports the implementation of business solutions by building relationships and partnerships with key stakeholders, identifying business needs, and determining and carrying out necessary processes and practices.
  • Promotes and supports company policies, procedures, mission, values, and standards of ethics and integrity by training and providing direction to others in their use and application, and by ensuring compliance with them.
  • Ensures business needs are being met by evaluating the ongoing effectiveness of current plans, programs, and initiatives; applying suggestions for improving efficiency and cost-effectiveness; and participating in and supporting community outreach events.
  • Creates training documentation and trains end users on data modeling. Oversees the tasks of less experienced programmers and provides system troubleshooting support.
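The pipeline responsibilities above follow the classic extract-transform-load (ETL) pattern: pull raw records from a source, run initial quality checks, transform them into a relevant structure, and load a targeted, consumable data asset. A minimal sketch in plain Python with toy in-memory data (all names and records here are illustrative, not Walmart systems; a production pipeline at this scale would use Spark/PySpark on Hadoop or a cloud platform):

```python
# Toy ETL pass over in-memory "sales" records. Every identifier and row
# below is hypothetical, for illustration only.
from collections import defaultdict

RAW_ROWS = [
    {"store": "042", "sku": "A1", "qty": "3", "price": "9.99"},
    {"store": "042", "sku": "A1", "qty": "2", "price": "9.99"},
    {"store": "107", "sku": "B7", "qty": "",  "price": "4.50"},  # bad row
]

def extract(rows):
    """Initial data-quality check: drop rows with a missing quantity."""
    return [r for r in rows if r["qty"].strip()]

def transform(rows):
    """Cast string fields to numeric types and derive revenue per row."""
    return [
        {"store": r["store"], "sku": r["sku"],
         "revenue": int(r["qty"]) * float(r["price"])}
        for r in rows
    ]

def load(rows):
    """Aggregate into a targeted asset: total revenue per (store, sku)."""
    asset = defaultdict(float)
    for r in rows:
        asset[(r["store"], r["sku"])] += r["revenue"]
    return dict(asset)

asset = load(transform(extract(RAW_ROWS)))
print(asset)  # revenue keyed by (store, sku); the bad row is filtered out
```

The same three stages map directly onto a distributed engine: `extract` becomes a source read plus filter, `transform` a typed projection, and `load` a grouped aggregation written to a governed target table.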

What you’ll bring:

Must Have

  • Well-versed in Hadoop, Spark, cloud platforms, Python/PySpark, Java, streaming, Kafka, and backend development.
  • A proven track record of coding in at least one programming language (e.g., Scala [preferred], Python).
  • Experience with at least one cloud computing platform (e.g., GCP, Azure).
  • Skilled in data modeling and data migration protocols.
  • Experience with GCP, data warehousing, and BI preferred.
  • Experience with integration tools such as Automic or Airflow.
  • Experience building highly scalable Big Data solutions and ETL ecosystems.

Nice to have

  • Knowledge of Databricks is an added advantage.
  • Hands-on knowledge of NoSQL databases such as Cosmos DB, along with RDBMS such as MySQL or Postgres, is a plus.
  • Hands-on working experience with a messaging platform such as Kafka is preferred.
  • Ability to increase team efficiency by establishing sound processes for software development, requirement intake, and effort estimation.
  • Demonstrated creative thinking, critical thinking, and troubleshooting skills.

About Walmart Global Tech

Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That’s what we do at Walmart Global Tech. We’re a team of software engineers, data scientists, cybersecurity experts and service professionals within the world’s leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work:

We use a hybrid way of working that is primarily in office coupled with virtual when not onsite. Our campuses serve as a hub to enhance collaboration, bring us together for purpose and deliver on business needs. This approach helps us make quicker decisions, remove location barriers across our global team and be more flexible in our personal lives.

Benefits:

Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include 401(k) match, stock purchase plan, paid maternity and parental leave, PTO, multiple health plans, and much more.

Equal Opportunity Employer:

Walmart, Inc. is an Equal Opportunity Employer – By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing diversity – unique styles, experiences, identities, ideas and opinions – while being inclusive of all people.

The above information has been designed to indicate the general nature and level of work performed in the role. It is not designed to contain or be interpreted as a comprehensive inventory of all responsibilities and qualifications required of employees assigned to this job. The full Job Description can be made available as part of the hiring process.

Minimum Qualifications…

Option 1: Bachelor’s degree in Computer Science and 2 years’ experience in software engineering or related field. Option 2: 4 years’ experience in software engineering or related field. Option 3: Master’s degree in Computer Science.

Preferred Qualifications…

Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.

Data engineering, database engineering, business intelligence, or business analytics; Master’s degree in Computer Science or related field and 2 years’ experience in software engineering or related field.

Primary Location…

2501 Se J St, Ste A, Bentonville, AR 72716-3724, United States of America
