Staff Data Engineer

Twilio

See yourself at Twilio

Join the team as our next Staff Data Engineer on Twilio’s Segment product team.

Who we are & why we’re hiring

Twilio powers real-time business communications and data solutions that help companies and developers worldwide build better applications and customer experiences.

Although we’re headquartered in San Francisco, we have a presence throughout South America, Europe, Asia and Australia. We’re on a journey to becoming a global company that actively opposes racism and all forms of oppression and bias. At Twilio, we support diversity, equity & inclusion wherever we do business.

About the job

The Data Engineering team at Twilio Segment is the backbone of all data-driven decisions we make to move the business forward. We are seeking a highly skilled data engineer to join our team and help drive our development process.

As a Staff Data Engineer, you will partner with business stakeholders across the organization to identify pain points, gather requirements, and extract value from our data. You will be responsible for designing, building, and maintaining pipelines that process terabyte-scale datasets using both batch and streaming processing techniques.

You will also help optimize the design of our data warehouse and help teams build data-driven processes and automation on top of it.

Responsibilities

In this role, you’ll:

  • Design, build, and maintain data pipelines that collect, process, and transform large volumes of data from various sources into a format suitable for analysis.
  • Develop and maintain our data warehouse (Snowflake) to enable efficient and accurate analysis of data.
  • Document data pipelines, data models, and data transformation processes.
  • Collaborate with cross-functional teams to identify and understand data requirements for various business needs.
  • Work with data scientists to build our internal machine learning infrastructure.

Qualifications

Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having “desired” qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn’t followed a traditional path, don’t let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required:

  • 7+ years of experience in data engineering or related fields, with a strong focus on designing and building scalable data systems.
  • Experience in designing scalable data warehouses and working with modern data warehousing solutions, such as Snowflake.
  • Experience with data orchestration tools like Airflow and dbt, with a solid understanding of data modeling and ETL principles.
  • Experience with infrastructure-as-code tools (e.g., Terraform) and modern CI/CD pipelines.
  • Proven track record of delivering large-scale data projects and working in cross-functional teams.
  • Self-starter with the ability to work independently as well as part of a team.

Desired:

  • Experience building large-scale distributed systems on AWS.
  • Experience with Python, Go, and/or Java.
  • Experience with streaming technologies such as Kafka or Kinesis.
  • Experience with managing and deploying machine learning models.
