Market Data Engineer - Data Engineering & PCAP Normalisation

  • Sydney
  • Permanent
  • CTd9-1636787
  • 09/12/2025
Remote (Global Hedge Fund – Quant/Trading Infrastructure Team)
  
Our client, a global hedge fund, is seeking a talented Market Data Engineer to join its Data Engineering team, focusing on the development and maintenance of infrastructure that processes and normalises market data. The successful candidate will work with high-frequency raw market data, including packet-capture (PCAP) files and direct exchange feeds, normalising it into a unified format that supports trading, quantitative research, risk, and analytics functions.
  
Key Responsibilities
  • Build and manage pipelines to process real-time and historical market data from exchanges, PCAPs, and external vendors.
  • Develop data normalisation logic to standardise formats, units, and timestamps across all datasets.
  • Architect, build, and maintain scalable storage and retrieval systems, including time-series databases and data lakes.
  • Optimise data ingestion for low-latency, high-performance trading environments.
  • Collaborate with traders, quants, and risk teams to ensure data readiness for live trading and back-testing.
  • Implement and enforce data governance practices to ensure data integrity and compliance.
  • Continuously enhance data infrastructure, tools, and workflows.
Required Skills & Qualifications
  • Bachelor’s degree or higher (MSc/PhD preferred) in Computer Science, Engineering, Mathematics, Physics, or a related quantitative discipline.
  • Strong programming skills in Python, Java, C++, or similar languages.
  • Extensive experience designing and optimising large-scale data pipelines, including real-time streaming and batch workflows.
  • Strong familiarity with financial markets data, including exchange feeds, historical tick data, PCAP feeds, and vendor data sources.
  • Proficiency with relational and/or time-series databases, data storage systems, data modelling, and data retrieval.
  • Solid understanding of data normalisation, data cleansing, data quality controls, schema design, and metadata management.
  • Ability to write robust, production-grade, well-tested, and maintainable code; experience with Unix/Linux environments, version control, and deployment best practices.
  • Strong analytical and problem-solving abilities, capable of handling edge cases, detecting anomalies, and ensuring data integrity.
  • Effective communication skills, with the ability to collaborate with quantitative researchers, traders, and risk/ops stakeholders.
Preferred Experience
  • Previous experience in a hedge fund, proprietary trading firm, or systematic trading environment.
  • Familiarity with real-time data distribution systems and PCAP data ingestion.
  • Knowledge of streaming technologies (e.g., Kafka, Kinesis) and distributed processing frameworks (e.g., Spark).
  • Experience with data governance, lineage tracking, and regulatory compliance.
Candidates with a passion for high-performance data infrastructure will thrive in this role, contributing to mission-critical systems that power global, multi-asset trading strategies at scale.
  
Candidates must have full Australian working rights.

Certus Recruitment Group is a well-established and experienced specialist consultancy providing sales, marketing, and IT recruitment services. We serve the technology sector across Australia, North America, the UK, and Europe.