Jonas Monkevičius

  • About me

    Data Engineering Tech Lead

  • Education

    • Free University of Bozen-Bolzano

      2014
      Erasmus exchange student during master's studies, Computer Science

      Main fields of study: Advanced Data Management Technologies (ETL, data warehousing, Hadoop, Big Data).

    • Vilniaus universitetas / Vilnius University

      2008 - 2012
      Bachelor's degree, Informatics
    • Vilniaus universitetas / Vilnius University

      2013 - 2015
      Master of Science in Computer Modelling

  • Experience

    • Itree Lietuva

      Aug 2012 - Jan 2016
      Software developer

      Oracle developer

    • CERN

      Jul 2014 - Oct 2014
      Trainee

      During the internship, the task was to migrate Workflow Management code to a modern database and DBI, upgrade CMS software via RPMs, and test it in virtual machines.

    • Vilnius University

      Apr 2016 - Oct 2016
      Software developer

      Software developer on several CERN projects: the CMS DB Loader application and a RESTful API to Oracle databases.

    • SEB

      Oct 2016 - Aug 2023

      Technologies:
      * Cloud: Google Cloud Platform products and services, notably Cloud Functions with Python, BigQuery, Pub/Sub, and Storage Buckets.
      * IaC: Terraform.
      * Scheduling: Airflow.
      * CI/CD: Jenkins, GitLab CI/CD pipelines, GitHub Actions, unit tests, integration tests.
      * Containerization: OpenShift, Docker, Kubernetes.
      * Cloudera (formerly Hortonworks): HDFS, Hive, NiFi, Kafka.
      * Spark: creating Spark ETL applications with Scala.
      * Databases: Hive, Oracle, MS SQL, IBM DB2.
      * Informatica: Big Data Management, Data Integration Hub, PowerCenter.

      Short summary:

      2016
      * Participated in the first steps of creating the Data Lake at SEB.

      2017 - 2018
      * Stored data in the Hadoop ecosystem (HDFS, Hive).
      * Built transformations with Informatica products: Big Data Management, Data Integration Hub.
      * Data sources: Oracle, MS SQL, IBM DB2, Hive, Apache Kafka, MQ, SFTP.

      2017 - 2020
      * Used Apache Spark (Scala) for heavier transformations.
      * Used Apache NiFi for data processing (Kafka, MQ, and SFTP data-transport services as sources).
      * Used Apache Airflow for data orchestration.
      * Used Jenkins as the CI/CD tool.

      2021
      * Used OpenShift as a platform for hosting applications in containers.

      2020 - 2023
      * Used GCP as the cloud service provider.
      * Created data pipelines with technologies such as Cloud Functions (Python), BigQuery, Pub/Sub, and Storage Buckets.
      * Created reusable GCP components with the IaC tool Terraform.
      * Used OpenShift as a platform, Docker for containerization, and Kubernetes for orchestration.
      * Created APIs to provide data to customers or for internal operations.

      • Data Engineering Tech Lead

        Feb 2023 - Aug 2023
      • Senior Data Engineer

        Oct 2016 - Jan 2023
    • Oxylabs.io

      Sep 2023 - Present
      • Data Engineering Tech Lead

        Jan 2024 - Present
      • Senior Data Engineer

        Sep 2023 - Dec 2023
  • Licenses & Certifications

    • 1Z0-146 Oracle Database 11g: Advanced PL/SQL [OCP]

      Dec 2013
    • Professional Data Engineer

      Google Cloud
      Jan 2022
    • Databricks Certified Associate Developer for Apache Spark 3.0

      Databricks
      Mar 2022