What are the responsibilities and job description for the Senior Software Engineer - Data Infrastructure position at Fintal Partners?
We are a global trading firm powered by a cutting-edge research environment and a world-class technology backbone. For the last 30 years, we’ve been a stabilizing force in financial markets, providing essential liquidity upon which market participants depend. Across our offices in the US, Europe, Asia Pacific, and India, our talented quant researchers, engineers, traders, and business operations professionals are united by our uniquely collaborative, high-performance culture, and our commitment to giving back. From entering dynamic new markets to embracing disruptive technologies, and from developing an innovative research environment to diversifying our trading strategies, we dare to continuously innovate and collaborate to succeed.
We are seeking a dedicated and experienced Software / Data Engineer to join our Chicago team. The ideal candidate is energized by working in a cutting-edge environment that keeps us at the forefront of the evolving global financial markets.
Core Responsibilities:
- Architect, develop, and deploy our Big Data environment (Kafka, Hadoop, Dremio, etc.)
- Build, deploy, and monitor our data processing pipelines (Java, Python, Spark, Flink)
- Collaborate with development teams on data modeling, data ingestion, and capacity planning
- Work with users to ensure data integrity and availability
- Act as a Big Data subject matter expert (SME) and consult on a variety of data-related questions from users and developers
Skills and Experience:
- 5 years of experience working in a mature data engineering environment
- 3 years of experience building Kafka streaming applications and/or maintaining Kafka clusters
- 2 years of experience building applications/pipelines with Big Data backends (S3, HDFS, Databricks, Iceberg, etc.)
- Experience with Apache Spark, Apache Flink, or similar tools
- Strong Java, Python, and SQL development skills
- Experience with common data science toolkits, especially Python-based ones
- Hands-on experience with Kubernetes and Docker
- Experience with monitoring tools such as Prometheus/Grafana, Alertmanager, Alerta, and OpsGenie
- Strong statistical analysis skills
- Demonstrated ability to troubleshoot and conduct root-cause analysis
- Unix scripting experience (bash, Python, etc.)