I am:

  • self-motivated and hardworking, especially when there is something exciting waiting for me ahead
  • willing to take on challenging yet interesting tasks
  • trying to find new approaches to extra hard tasks that lie outside the competence I have (or am expected to have) on board
  • curious and excited to learn new technologies
  • experimenting with related toolsets to find out whether something can be replaced to improve the current setup
  • trying to understand how things work under the hood, which comes in very handy when fixing issues

Work experience

2019 – till now

Node.JS Developer

Microbilt Ukraine

Aim: find a way to automate web scraping (web data extraction) processes and develop a corresponding application, then maintain it further.

Investigation: analyzing the available web automation tools, such as Phantom.JS, Casper.JS, Selenium Web Driver, Puppeteer, Nightmare.JS and Electron.JS, to determine which one, if any, can be taken as the basis for the application development.

Resolution: the Nightmare.JS automation library was chosen as the basis for the application development, since it meets all the technical requirements and allows its functionality to be extended by developing customized actions on top of the Electron.JS API.

Responsibilities: developing, testing, deploying and documenting the web scraping (web data extraction) application, including its updates and maintenance.

2015 – 2019

WebQL Developer

Microbilt Ukraine

Web Data Extraction (web scraping) scripts development.

Skills

JavaScript

I always try to dive deeper into the details (a short sketch follows this list):

  • concurrency model (event loop, event table, event queue, call stack)
  • execution context (lexical environment and scope chain)
  • prototype chain
  • closures
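
To illustrate the level of detail I mean, here is a minimal sketch of my own (not taken from any project) showing how the call stack, the event queue and a closure interact; the ordering of the output is the point:

    // Synchronous code on the call stack always finishes before the event loop
    // picks the queued callback, and the callback (a closure) still sees the
    // variable from its lexical environment.
    function schedule() {
      const createdAt = 'inside schedule()';   // captured by the closure below
      setTimeout(() => {
        // Runs later, from the event queue, once the call stack is empty.
        console.log('3: callback, closure still sees:', createdAt);
      }, 0);
    }

    console.log('1: synchronous start');
    schedule();
    console.log('2: synchronous end');
    // Output order: 1, 2, 3, even with a 0 ms timeout.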

ES6

Node.JS

  • command line tools
  • client-server applications
  • inter-process communication (sketched below)
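
As a self-contained sketch of the inter-process communication point above (the file simply forks itself; the message contents are illustrative):

    // Minimal Node.JS IPC sketch: the parent forks a child from this same file
    // and the two processes exchange messages over the built-in IPC channel.
    const { fork } = require('child_process');

    if (process.send) {
      // Child process: echo back whatever the parent sends.
      process.on('message', (msg) => process.send(`child received: ${msg}`));
    } else {
      // Parent process: fork the child and talk to it.
      const child = fork(__filename);
      child.on('message', (reply) => {
        console.log(reply);      // "child received: ping"
        child.disconnect();      // close the IPC channel so both processes exit
      });
      child.send('ping');
    }
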
Express

  • router
  • middleware
  • static resources
  • EJS template engine
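
A minimal Express sketch tying the four points above together (the static folder, view name and port are placeholders, not taken from a real project):

    const express = require('express');
    const app = express();

    app.set('view engine', 'ejs');          // EJS template engine
    app.use(express.static('public'));      // static resources

    // Simple logging middleware.
    app.use((req, res, next) => {
      console.log(`${req.method} ${req.url}`);
      next();
    });

    // A small router mounted under /api.
    const api = express.Router();
    api.get('/status', (req, res) => res.json({ ok: true }));
    app.use('/api', api);

    // Render an EJS view (assumes views/index.ejs exists).
    app.get('/', (req, res) => res.render('index', { title: 'Demo' }));

    app.listen(3000, () => console.log('listening on port 3000'));
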
Electron.JS

Framework for building JavaScript desktop applications:

  • web data extraction application development
  • desktop application development to automate daily routine tasks
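
A minimal Electron main-process sketch for the kind of desktop tool mentioned above (the window size and index.html are placeholders):

    const { app, BrowserWindow } = require('electron');

    function createWindow() {
      const win = new BrowserWindow({ width: 1024, height: 768 });
      win.loadFile('index.html');   // UI of the desktop application
    }

    app.whenReady().then(createWindow);

    // Quit when all windows are closed (except on macOS, by convention).
    app.on('window-all-closed', () => {
      if (process.platform !== 'darwin') app.quit();
    });
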
Nightmare.JS

High-level automation library based on Electron.JS:

  • using built-in methods to solve a variety of automated web scraping tasks
  • using the Electron.JS API to extend the built-in functionality
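
A minimal sketch of both points: a built-in navigation flow plus a custom action registered through Nightmare's action API, which is the usual way to reach down to the Electron.JS level (the URL, selector and action name are illustrative only):

    const Nightmare = require('nightmare');

    // Custom action available to every instance; the passed function is
    // executed inside Electron's renderer process.
    Nightmare.action('pageTitle', function (done) {
      this.evaluate_now(() => document.title, done);
    });

    const nightmare = Nightmare({ show: false });

    nightmare
      .goto('https://example.com')     // built-in navigation
      .wait('h1')                      // built-in wait for content to appear
      .pageTitle()                     // the custom action defined above
      .end()
      .then((title) => console.log('scraped title:', title))
      .catch((err) => console.error('scraping failed:', err));
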
XML/XSLT

In some particular cases I prefer XML over SQL or NoSQL for storing data. For example, I used XML to store application reference data and XSLT to present this information to the end user.
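
A minimal browser-side sketch of that pattern, assuming the reference data lives in data.xml and the presentation rules in view.xsl (both file names are hypothetical):

    // Load the XML reference data and the XSLT stylesheet, then render.
    async function renderReferenceData() {
      const parse = (text) => new DOMParser().parseFromString(text, 'application/xml');

      const xmlDoc = parse(await (await fetch('data.xml')).text());
      const xslDoc = parse(await (await fetch('view.xsl')).text());

      // Apply the stylesheet and show the result to the end user.
      const processor = new XSLTProcessor();
      processor.importStylesheet(xslDoc);
      document.body.appendChild(processor.transformToFragment(xmlDoc, document));
    }

    renderReferenceData();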

Version Control Systems

I have experience with both Git and SVN. I have been using Git along with the GitHub platform during the web scraping application development, while SVN has been used for the work with WebQL (web scraping) scripts.

Bash

I use Bash scripting to handle the configuration, preparation, launching, etc. that my applications require.

Debian GNU/Linux

The main operating system I prefer to use is Windows; however, I use Debian to test my applications, which are intended to work on Unix-like systems as well as on Windows.

Education

2009 – 2014

Master's degree

Kharkiv National University of Radio Electronics

Modelling, development and maintenance of program and information systems.