Senior Data Engineer

DRW
Chicago
Full-time

Job Location: Chicago | Employment type: Regular | Department: Technology | Targeted Start Date: Immediate

DRW is a diversified trading firm with more than three decades of experience bringing sophisticated technology and exceptional people together to operate in markets around the world.

We value autonomy and the ability to quickly pivot to capture opportunities, so we trade our own capital at our own risk.

Headquartered in Chicago with offices throughout the U.S., Canada, Europe, and Asia, we trade a variety of asset classes including Fixed Income, ETFs, Equities, FX, Commodities and Energy across all major global markets.

We have also leveraged our expertise and technology to expand into three non-traditional strategies: real estate, venture capital, and cryptoassets.

We operate with respect, curiosity, and open minds. The people who thrive here share our belief that it’s not just what we do that matters, it’s how we do it.

DRW is a place of high expectations, integrity, innovation and a willingness to challenge consensus.

As a Senior Data Engineer on our Unified Platform team, you will play an integral role in designing and building an innovative data platform used by Traders, Quantitative Researchers, and Back-Office personnel to analyze financial markets, determine trading opportunities, establish new strategies, and ensure smooth back-office processes.

Technical requirements summary:

  • Experience designing and building data-intensive distributed systems
  • Experience working within modern batch and streaming data ecosystems
  • Expertise in Java/Scala or Python, along with experience in SQL and Bash
  • Ability to own, organize, and steer team projects
  • Ability to contribute to project management and project reporting
  • Leadership and mentorship of junior team members on engineering best practices and code quality

What you will do in this role:

  • Help design, build, and manage DRW's Unified Data Platform and support its users.
  • Work closely with Traders and Researchers to identify appropriate data sources, and implement processes to onboard and manage new data sources for analysis to unlock future trading opportunities.
  • Design and develop data solutions that help discover, purchase, organize, track usage of, manage rights to, and control the quality of data sets to address the needs of various DRW trading teams and strategies.
  • Continually monitor data ingestion pipelines and data quality to ensure the stability, reliability, and quality of the data.
  • Contribute to the monitoring and quality-control software and processes.

What you will need in this role:

  • 7+ years of experience working with modern data technologies and/or building data-intensive distributed systems
  • Expert-level skills in Java/Scala or Python, with a proven ability to produce high-quality, maintainable code
  • Strong familiarity with SQL and Bash
  • Experience leveraging and building cloud-native technologies for scalable data processing
  • Prior experience with both batch and streaming systems, and an understanding of the limitations those paradigms impose
  • Experience with an array of data processing technologies (e.g., Flink, Spark, Polars, Dask)
  • Experience with an array of data storage technologies (e.g., S3, RDBMS, NoSQL, Delta/Iceberg, Cassandra, ClickHouse, Kafka)
  • Experience with an array of data formats and serialization systems (e.g., Arrow, Parquet, Protobuf/gRPC, Avro, Thrift, JSON)
  • Experience managing complex ETL data pipelines with orchestration tools (e.g., Kubernetes, Argo Workflows, Airflow, Prefect, Dagster)
  • Prior experience dealing with schema governance and schema evolution
  • Prior experience developing data quality control processes to detect data gaps or inaccuracies
  • A desire to mentor less experienced team members and champion both engineering best practices and high code-quality standards
  • Strong technical problem-solving skills
  • A proven ability to work in an agile, fast-paced environment, prioritize multiple tasks and projects, and efficiently handle the demands of a trading environment