
Professional Summary

Senior Backend Developer with 5+ years of experience in Python, specializing in the financial services and trading domains. Demonstrated experience building scalable, high-throughput, low-latency systems on AWS. Deep expertise in big data pipelines, data lake architectures, real-time streaming, and robust CI/CD practices. Proven ability to collaborate with data scientists, front-end engineers, and trading experts to deliver production-grade, compliant, and secure solutions for sensitive financial data.

Work Experience

Senior Backend Developer — Financial & Trading Systems

Upwork.com

- Design, develop, and maintain backend services for large-scale financial/trading applications, focusing on throughput, latency, and reliability.
- Architect and implement big data pipelines and data lake architectures to ingest, process, and serve complex trading datasets.
- Optimize systems for performance (throughput/latency), resiliency, and cost efficiency in AWS environments.
- Collaborate with front-end engineers, data scientists, and trading experts to deliver robust, end-to-end solutions.
- Enforce security, compliance, and reliability standards for sensitive financial data through secure coding practices and audits.
- Technologies: Python (async, FastAPI), AWS (Lambda, S3, Redshift, Glue, EMR, Kinesis), Docker, Kubernetes, CI/CD, SQL/NoSQL databases

Backend Engineer — Data-Intensive Finance Platform

Upwork.com

- Implemented modular microservices to support real-time pricing, risk calculations, and trade lifecycle management.
- Built and optimized data ingestion from market data feeds, implemented fault-tolerant ETL/ELT pipelines, and maintained data quality and lineage.
- Developed monitoring, alerting, and tracing to ensure system reliability in production.
- Technologies: Python, PostgreSQL, DynamoDB, Redis; Airflow or alternative orchestration; AWS services

Senior Software Engineer — Trading Analytics & ML Integration

Upwork.com

- Collaborated with data scientists to integrate ML models into production pipelines for risk analysis and strategy evaluation.
- Ensured scalable serving of model outputs with low-latency integration into trading workflows.
- Technologies: Python, FastAPI, TensorFlow/PyTorch (conceptual), AWS

Key Projects (Portfolio Highlights)

Real-Time Trade Data Lake & Analytics Platform

- Built an analytics-ready data lake on AWS with Glue for ETL, Redshift as the data warehouse, and Kinesis for streaming ingestion.
- Implemented scalable data schema design, partitioning, and data quality checks; enabled rapid querying for research and decision-making.
- Outcome: Reduced data retrieval time, improved query performance, and streamlined regulatory reporting.

High-Throughput Market Data Ingestion Service

- Designed a fault-tolerant ingestion service handling bursty market data streams; used Kinesis and Lambda for near-real-time processing.
- Implemented backpressure handling, idempotent processing, and robust retry policies; integrated with downstream analytics and risk systems.
- Outcome: Sustained high data integrity and low latency under peak load.

CI/CD for Financial Applications

- Established automated CI/CD pipelines with unit/integration tests, security scans, and blue/green deployment strategies for production stability.
- Tools: Jenkins/GitHub Actions, Terraform/CloudFormation, Docker, Kubernetes
- Outcome: Faster, safer deployments with improved rollback capabilities and compliance traceability.


Education

Master of Science in Computer Engineering

Mar 2017 – Feb 2020
Lviv Polytechnic National University

Computer Science