Deepesh Chandra Dubey

Analyst

448 followers
South Weymouth, Massachusetts, United States

  • About me

    Data Engineer @ Fidelity | ETL | Python | Airflow | Spark | AWS | Snowflake | Big Data | SRE | DevOps | Django

  • Education

    • Amity University

      2008 - 2012
      Bachelor of Technology (B.Tech.), Bioinformatics
  • Experience

    • BioAxis DNA Research Centre (P) Ltd

      May 2009 - Jun 2009
      Analyst

      - Was part of a team that analyzed DNA sequences for anomalies and identified the required preventive measures.
      - Completed an internship program on gene sequencing sample data.

    • Wipro

      Jun 2012 - Jul 2017

      - Partnered with development teams on product development, application support plans, prototype programs, and bug fixes.
      - Provided Level 3 support for all production issues with real-time objectives.
      - Handled Disaster Recovery Exercise (DRE) activities and major application upgrades to ensure smooth failover and application availability.
      - Worked with the command center and global support teams on event resolution and faster recovery processes.
      - Maintained customer satisfaction with forward-thinking strategies focused on addressing customer needs and resolving concerns.
      - Performed troubleshooting, including root cause analysis, for migration projects covering errors and issues in production and non-production environments.
      - Experienced in shell scripting and in Oracle tools and utilities such as OEM, ADR, SQL Navigator, and Toad.
      - Monitored database system details, including stored procedures and execution times, and implemented efficiency improvements.
      - Worked closely with the systems, applications, database, and network teams to manage changes to the production environment, equipment, operating systems, application software, and database systems.
      - Participated in post-incident problem management processes to identify root causes of issues and develop remediation plans.
      - Performed daily system checks and monitored applications, databases, and business batch processes.
      - Identified the possible scope for automation within the application process.
      - Developed and implemented automation tools, scripts, and utilities using Oracle PL/SQL, Unix shell scripting, and Python to reduce manual intervention in the support process, thereby reducing overall effort, streamlining the process, and improving project delivery.
      - Participated in writing requirement and functional specification documents based on client input, developing case studies, and presenting feasible solutions.
      - Shared domain knowledge (banking and finance) within and across projects, including with new members.
      - Gained a solid understanding of the application development process by creating, testing, and deploying software on custom hardware.
      - Gained exposure to various technologies over this period: Oracle PL/SQL, Python, Unix shell scripting, and cloud infrastructure.
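      The daily system checks and automation utilities described above could be sketched roughly as follows; this is a minimal, hypothetical Python illustration (the check names, the 10% free-disk threshold, and the `run_checks` helper are assumptions for illustration, not the actual tooling):

      ```python
      # Minimal sketch of a daily system-check automation: run a set of
      # named health checks and collect the failures for the support
      # report instead of checking each item by hand. All check names
      # and thresholds are illustrative.
      import shutil

      def disk_space_ok(path="/", min_free_fraction=0.10):
          """Pass when at least `min_free_fraction` of the disk is free."""
          usage = shutil.disk_usage(path)
          return usage.free / usage.total >= min_free_fraction

      def run_checks(checks):
          """Run each named check and return the names of those that failed."""
          return [name for name, check in checks.items() if not check()]

      checks = {
          "disk_space": disk_space_ok,
          # In practice there would also be database-connectivity,
          # batch-status, and log-scan checks registered here.
      }
      failed = run_checks(checks)
      ```

      A scheduler (cron, or later Airflow) would run such a script daily and mail or page the support team only when `failed` is non-empty.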

      • Senior Project Engineer

        Jun 2013 - Jul 2017
      • Project Engineer

        Jun 2012 - Jun 2013
    • State Street

      Jul 2017 - May 2020
      Programming Analyst

      - Analyzed data load/feed processes for batch and real-time sources across channel types (SFTP, MQ, NDM, etc.) and file formats (delimited, XML, Excel, etc.), including report-generation delays and failure analysis.
      - Provided support for the Enterprise Serving Platform (ESP) data warehousing application across several platforms and technologies: Oracle/Exadata, Data Hub inbound/outbound (web services, MQ, SFTP), and the Data Service Layer (physical and logical data marts).
      - Identified bugs in the production environment, provided root causes, and proactively fixed production bugs, drawing on in-depth knowledge of the application's functionality and aiming for continuous improvement.
      - Analyzed data flow into the persistent data marts for each individual instance (business unit), spread across several servers, in the multi-threaded mart refresh process.
      - Performed periodic stress tests to evaluate application capacity and ensure sufficient capacity for upcoming data loads.
      - Worked with APM (Application Performance Monitoring) tools and onboarded suitable tools (Geneos, Dynatrace, AMA (in-house), GOV) across platforms to reduce manual monitoring effort, manual misses, and manual errors.
      - Developed custom scripts in Python/Oracle to monitor components not covered by the tools' default packages, applying them across the board with support rules to raise an alarm on any deviation observed in application processing.
      - Developed business and support operational views so client relationship managers and end clients could see real-time data flow and processing on a dashboard.
      - Participated in regular Agile Scrum meetings to discuss environment issues, performance problems, and SLA misses, and to resolve them.
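      The custom deviation-monitoring scripts mentioned above could work along these lines; this is a minimal, hypothetical Python sketch (the component names, the tolerance value, and the `check_deviation` helper are assumptions, not the actual production scripts):

      ```python
      # Minimal sketch of a deviation monitor: compare a component's
      # latest processing time against its recent baseline and flag an
      # alarm when it drifts beyond a tolerance. All names and
      # thresholds are illustrative.
      from statistics import mean

      def check_deviation(samples, latest, tolerance=0.5):
          """Return True if `latest` deviates from the mean of `samples`
          by more than `tolerance` (a fraction of the baseline mean)."""
          baseline = mean(samples)
          return abs(latest - baseline) > tolerance * baseline

      def monitor(component, samples, latest):
          """Return a status record; 'ALARM' marks a deviation."""
          if check_deviation(samples, latest):
              # In production this would raise an alert in the APM tool
              # (e.g. Geneos); here we just return an alarm record.
              return {"component": component, "status": "ALARM", "latest": latest}
          return {"component": component, "status": "OK", "latest": latest}
      ```

      For example, if the mart refresh normally takes about 10 minutes and suddenly takes 30, `monitor("mart_refresh", [10, 11, 9, 10], 30)` would produce an `ALARM` record that the support rules could turn into a page.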

    • Maxima Consulting, Inc.

      May 2020 - Feb 2024
      Senior Software Developer/ Data Engineer

      - Designed and implemented a real-time data pipeline to process semi-structured data, integrating large datasets from 10+ data sources using PySpark, Python, and Hadoop.
      - Automated the ETL process across billions of rows of data, substantially reducing manual workload.
      - Maintained 99% data-pipeline uptime while ingesting streaming and transactional data across 8 primary data sources.
      - Worked in an Agile methodology, interacting directly with the entire team to give and take feedback on design, suggest and implement optimal solutions, and tailor the application to business requirements while following standards.
      - Constructed product-usage SDK data using PySpark, Spark SQL, and Hive context in partitioned Hive external tables maintained in Hadoop for reporting, data science dashboarding, and ad hoc analyses.
      - Built a data virtualization layer and data visualizations in Tableau, accessing aggregations with Impala and Hive queries.
      - Ingested data from disparate sources (SQL, Microsoft Teams, SharePoint) using Python REST APIs and pandas to create data views for BI tools such as Tableau.
      - Used Spark in Python to distribute processing of large streaming datasets.
      - Developed data modeling units to perform predictive analysis.

    • Fidelity Investments

      Feb 2024 - Present
      Business Intelligence Consultant – Data Engineer
  • Licenses & Certifications