Posted 12 Apr 2024, 1:00 am

Data Engineer at OfferUp

Sorry, but this job listing has expired!

About the role:

We are looking for an experienced Data Engineer to build, operate, and scale OfferUp’s batch and near-real-time data processing platforms and tools, which power data-driven capabilities throughout the organization. This is a highly visible opportunity to build systems that drive the end-user experience and directly support backend engineers, business intelligence dashboards and reports, data analysts, and data scientists. Building the largest and most responsive mobile marketplace poses unique data challenges that require leveraging the latest developments in data infrastructure. We use open-source infrastructure where we can, but are ready to build and share solutions where they don’t exist yet. Come help us do that!

This role is available only outside the US; it is not open to individuals based in the US or any US territory.

Here’s more of what you will get to do:

  • Design and develop applications to process large amounts of critical information to power analytics and user-facing features.
  • Monitor and resolve data pipeline or data integrity issues.
  • Work across multiple teams to understand their data needs.
  • Maintain and expand our data infrastructure.

You’ll thrive in this role if you have:

  • 3+ years of professional software development experience
  • Strong grounding in distributed systems for large-scale data processing
  • Ability to communicate technical information effectively to technical and non-technical audiences
  • Proficiency in SQL and Python
  • Experience leveraging open-source data infrastructure projects such as Airflow, Kafka, Avro, Parquet, Hadoop, Hive, HBase, Presto, or Druid
  • Experience building scalable data pipelines and real-time data streams
  • Experience building software in AWS or a similar cloud environment
  • Experience with AWS services such as Kinesis, Firehose, Lambda, and SageMaker is a big plus
  • Experience with GCP services such as BigQuery and Cloud Functions is a big plus
  • Experience with operational tools such as Terraform, Datadog, and PagerDuty is a big plus
  • Computer Science or Engineering degree required; Master’s degree preferred
  • Excellent communication skills, both written and spoken (fluency in English required)

The offering company is responsible for the content on this page / the job offer.
Source: Remote Ok