Madhu Babu Avula

16,000 followers
Plano, Texas, United States

  • About me

    Lead Data Engineer | GCP | Azure | SAFe Agile | Project Management

  • Education

    • MVH

      1999 - 2000
      School of secondary education, Physical Sciences (8.1)
    • Sri Venkateswara University

      2004 - 2007
      Bachelor’s degree, Computer Science (Distinction)
    • Dravidian University, Kuppam, Chittoor District

      2010 - 2012
      Master's degree, Business Administration and Management, General (6.0)
    • Sri Chandra Reddy junior college

      2000 - 2002
      Intermediate, Mathematics (7.6)
  • Experience

    • Cognizant

      Dec 2007 - Feb 2011

      Project 5: Citi Asset Allocation and Settlement System (CAASI)
      Environment: Informatica 9.0, Cognos 10.0, Teradata, Control-M, UNIX
      Client: Citibank, Asset & Wealth Management
      Designation: Sr. Developer
      Duration: Feb 2011 - Aug 2012
      Team size: 10
      Roles & responsibilities:
      ● Built test estimates and the Master Test Plan (MTP) based on requirement gathering and project analysis with BAs, architects, and business partners.
      ● Provided walkthroughs of QA artifacts and deliverables to business partners.
      ● Converted business discussions and technical documents into requirement documents, test artifacts, test scenarios, test cases, test scripts, and appropriate test data.
      ● Validated all aggregations and calculations performed at different levels.
      ● Checked conditional formatting rules and alerting rules using a boundary-value-analysis approach.
      ● Validated static and dynamic data changes within dashboards and reports.
      ● Created and used metadata for objects: DB tables, procedures, views, flat files, and JSON files.
      ● Validated in-memory data against reports and database objects.
      ● Tested charts/components with different data types: decimal values, very large values, negative values, null values, and zero values.
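      The aggregation validation described above can be sketched in pure Python: recompute totals from the source rows and compare them against the figures the report shows. This is a minimal illustration, not the project's actual tooling; `aggregate`, `validate_report`, and the sample rows are hypothetical names invented for the example.

```python
from collections import defaultdict

def aggregate(rows, group_key, value_key):
    """Recompute totals from source rows at one grouping level (sum of a measure)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[group_key]] += row[value_key]
    return dict(totals)

def validate_report(rows, report_totals, group_key, value_key, tol=0.01):
    """Compare recomputed totals against the report's figures; return mismatches."""
    expected = aggregate(rows, group_key, value_key)
    mismatches = {}
    for key, reported in report_totals.items():
        if abs(expected.get(key, 0.0) - reported) > tol:
            mismatches[key] = (expected.get(key, 0.0), reported)
    return mismatches

# Example: validate region-level totals shown on a dashboard (made-up data).
rows = [
    {"region": "EMEA", "amount": 120.0},
    {"region": "EMEA", "amount": 30.0},
    {"region": "APAC", "amount": 55.5},
]
report = {"EMEA": 150.0, "APAC": 55.5}
assert validate_report(rows, report, "region", "amount") == {}
```

      The same check runs at each grouping level (account, region, total), which is what "aggregations performed at different levels" amounts to in practice.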

      • Support Engineer

        Mar 2010 - Feb 2011
      • Support Trainee

        Dec 2007 - Mar 2010
    • Capgemini

      Feb 2011 - Aug 2012
      ETL Analyst

      Project 6: Investment Management System
      Environment: Ab Initio 5.0, Cognos 10.0, Control-M, UNIX
      Client: Credit Suisse, Investment Banking
      Designation: Programmer Analyst
      Duration: Jun 2008 - Jan 2011
      Team size: 10
      Roles & responsibilities:
      ● Performed smoke tests, sanity tests, and DDL checks.
      ● Executed Ab Initio graphs and loaded database tables per dependencies.
      ● Performed count and data validation between source (upstream) and target (downstream) after applying column- and table-level transformations.
      ● Validated DWH concepts across dimension, fact, and lookup tables of each type.
      ● Ensured relationships in the data model, mapping document, and tables stayed in sync.
      ● Validated business scenarios, positive and negative, for initial and incremental data loads.
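      The source-to-target count and data validation above can be illustrated with a small pure-Python sketch: apply the expected transformation to the source rows, then check both row counts and values against the target. `transform` stands in for whatever column-level logic the Ab Initio graph applies; all names and data here are hypothetical.

```python
def transform(row):
    """Hypothetical column-level transformation the ETL graph is expected to apply."""
    return {"id": row["id"], "name": row["name"].strip().upper()}

def reconcile(source_rows, target_rows):
    """Count check plus row-by-row data check between source and target."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"count mismatch: {len(source_rows)} vs {len(target_rows)}")
    expected = {r["id"]: transform(r) for r in source_rows}
    for row in target_rows:
        if expected.get(row["id"]) != row:
            issues.append(f"data mismatch for id {row['id']}")
    return issues

# Made-up upstream/downstream rows: target should match transformed source.
source = [{"id": 1, "name": " alice "}, {"id": 2, "name": "bob"}]
target = [{"id": 1, "name": "ALICE"}, {"id": 2, "name": "BOB"}]
assert reconcile(source, target) == []
```

      A non-empty issue list flags either a dropped/duplicated row (count) or a transformation defect (data).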

    • Cognizant

      Aug 2012 - Mar 2017

      Project: SEI Wealth Platform
      Scope: The SEI Wealth Platform offers a fully integrated, front-to-back experience for practitioners in the wealth management space. The goal of the platform is to deliver the value of an integrated systems environment with embedded best-practice workflows, allowing firms to focus their own efforts on unique market differentiators instead of systems integration. The objective of this project was to track investors' portfolio management, asset transfers, cash flow, loans, and other account-related information, and to generate reports using Kibana.
      Roles & responsibilities:
      ● Design, development, technical reviews, testing, and documentation.
      ● Participated in architecture discussions for the project with the BA and design teams.
      ● Developed Sqoop jobs with incremental loads from heterogeneous RDBMSs (SQL Server) using native DB connectors.
      ● Designed a Hive repository with external tables, internal tables, buckets, partitions, and ORC compression for incremental loads of parsed data feeding analytical and operational dashboards.
      ● Integrated Hive with Spark via Spark SQL to pull data from the Hive metastore.
      ● Designed RDDs and DataFrames in PySpark to implement business logic per requirements.
      ● Exported data to MySQL using Sqoop export.
      ● Developed scripts to generate OLAP reports with the Kibana visualization tool.
      ● Handled support issues raised by customers and by the QA team.
      ● Debugged issues using the Spark shell, provided fixes, and monitored the Oozie job scheduler.
      ● Prepared and verified release notes for SIT/UAT and production.
      ● Verified regression testing as part of pre-production activities.

      Project: Mortgage Banking Application
      Scope: Mortgage banking application covering auto and student loan origination, sanction, and the end-to-end cycle, moving from a BDW architecture to an ICDW model. Built staging, integration, semantic, and reporting layers using the DW tool Informatica, with OLAP reports in Cognos and Teradata as the database.
      Roles & responsibilities:
      ● Developed and tested extraction, transformation, and load (ETL) processes.
      ● Designed and developed Informatica mappings and sessions, based on business user requirements and business rules, to load data from source flat files and Oracle tables into target tables.
      ● Designed and implemented stored procedures, views, and other application database code objects to support complex mappings.
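      The incremental-load pattern mentioned above (Sqoop-style append mode on a check column) reduces to watermark tracking: remember the highest value already loaded and pull only rows beyond it. A minimal pure-Python sketch of that logic, with `incremental_batch` and the sample rows invented for illustration:

```python
def incremental_batch(rows, check_column, last_value):
    """Select only rows newer than the stored watermark (analogous to Sqoop's
    --incremental append with --check-column/--last-value), and return the
    batch together with the new watermark."""
    new_rows = [r for r in rows if r[check_column] > last_value]
    new_watermark = max((r[check_column] for r in new_rows), default=last_value)
    return new_rows, new_watermark

# First load picks up everything after the initial watermark...
rows = [{"id": 1, "ts": 10}, {"id": 2, "ts": 20}]
batch, wm = incremental_batch(rows, "ts", 0)
assert [r["id"] for r in batch] == [1, 2] and wm == 20

# ...and the next run only sees rows added since.
rows.append({"id": 3, "ts": 30})
batch, wm = incremental_batch(rows, "ts", wm)
assert [r["id"] for r in batch] == [3] and wm == 30
```

      In a real Sqoop job the watermark is persisted (e.g. in a saved job's metastore) between runs; here it is just threaded through the calls.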

      • Tech Lead

        Feb 2015 - Mar 2017
      • Senior ETL Analyst

        Aug 2012 - Jan 2015
    • Hexaware Technologies

      Aug 2017 - Mar 2019
      Big Data Developer

      Project scope: Integrated three applications (PPT, TCO, and SPH) into one unified application, SPA, used to calculate sales penetration across regions worldwide. Involved OLAP data migration from the DW to HDFS, applied business logic using Hive, and generated sales reports showing penetration percentages by region with the Tableau reporting tool.
      Roles & responsibilities:
      ● Gathered business requirements from business partners and subject-matter experts.
      ● Used Sqoop extensively to ingest data from source systems such as Oracle and SQL Server into HDFS.
      ● Worked on a live 20-node Hadoop cluster running on Cloudera.
      ● Developed Hive queries for ad-hoc analysis of user requirements.
      ● Resolved performance issues by fine-tuning HiveQL scripts.
      ● Implemented functionality-based data modeling on Hive tables and stored the resulting record sets in HBase via the HBase storage handler's column mapping.
      ● Used the Oozie scheduler to automate the pipeline workflow and orchestrate the MapReduce jobs that extract data on a timely basis.
      ● Used ZooKeeper to provide coordination services for the cluster.
      ● Actively participated in the software development lifecycle (scope, design, implement, deploy, test), including design and code reviews, test development, and test automation.
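      The Hive table layout work mentioned here and in the other entries (partitions plus buckets) can be illustrated with a small Python sketch of how a row maps to a partition directory and a bucket number. Hive uses its own hash function rather than Python's, and `hive_layout` and the sample row are hypothetical names for illustration only.

```python
def hive_layout(row, partition_col, bucket_col, num_buckets):
    """Assign a row to a partition directory and a bucket file, mirroring
    Hive's PARTITIONED BY + CLUSTERED BY ... INTO N BUCKETS layout
    (bucket = hash(key) mod N; real Hive's hash differs from Python's)."""
    partition = f"{partition_col}={row[partition_col]}"
    bucket = hash(row[bucket_col]) % num_buckets
    return partition, bucket

# Example: a sales row partitioned by load date, bucketed by account id.
row = {"load_date": "2018-06-01", "account_id": 42}
part, bucket = hive_layout(row, "load_date", "account_id", 8)
assert part == "load_date=2018-06-01"
assert bucket == 42 % 8  # hash(int) == int for small ints in CPython
```

      Partitioning prunes whole directories at query time; bucketing spreads each partition across a fixed number of files, which helps joins and sampling.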

    • SEI

      Mar 2019 - Dec 2020
      Sr. Big Data Developer

      Project scope: The SEI Wealth Platform offers a fully integrated, front-to-back experience for practitioners in the wealth management space. The goal of the platform is to deliver the value of an integrated systems environment with embedded best-practice workflows, allowing firms to focus their own efforts on unique market differentiators instead of systems integration. The objective of this project was to track investors' portfolio management, asset transfers, cash flow, loans, and other account-related information, and to generate reports using Kibana.
      Roles & responsibilities:
      ● Design, development, technical reviews, testing, and documentation.
      ● Participated in architecture discussions for the project with the BA and design teams.
      ● Developed Sqoop jobs with incremental loads from heterogeneous RDBMSs (SQL Server) using native DB connectors.
      ● Designed a Hive repository with external tables, managed tables, buckets, partitions, and ORC compression for incremental loads of parsed data feeding analytical and operational dashboards.
      ● Integrated Hive with Spark via Spark SQL to pull data from the Hive metastore.
      ● Applied transformation logic per business requirements using PySpark (RDDs and DataFrames).
      ● Exported data to MySQL using Sqoop export.
      ● Developed scripts to generate OLAP reports with the Kibana visualization tool.
      ● Handled support issues raised by customers and by the QA team.
      ● Debugged issues using the Spark shell, provided fixes, and monitored the Oozie job scheduler.
      ● Prepared and verified release notes for SIT/UAT and production.
      ● Tested source-to-target data mappings.
      ● Performed BI testing: validated all aggregations and calculations performed at different levels.
      ● Checked conditional formatting rules and alerting rules using a boundary-value-analysis approach.
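      The boundary-value-analysis approach mentioned above is simple to sketch: for a numeric field with a valid range, test just outside, on, and just inside each boundary. `boundary_values` is a hypothetical helper written for this illustration.

```python
def boundary_values(low, high):
    """Classic boundary-value-analysis test points for an inclusive numeric
    range [low, high]: just below, on, and just above each boundary."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

# e.g. a report field that only accepts percentages 0-100:
assert boundary_values(0, 100) == [-1, 0, 1, 99, 100, 101]
```

      Feeding these six values to a dashboard filter or alerting rule exercises exactly the edges where off-by-one defects cluster.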

    • Standard Chartered Bank

      Jan 2021 - Mar 2021
      Senior Developer
    • EPAM Systems

      Apr 2021 - Dec 2021
      Senior Data Engineer
    • Tata Consultancy Services

      Dec 2021 - Present
      • Tech Lead

        May 2023 - Present
      • Technical Lead

        Dec 2021 - May 2023
  • Licenses & Certifications

    • SAFe Agilist

      Scaled Agile, Inc.
      Nov 2018
    • PRINCE2® Foundation

      AXELOS Global Best Practice
      Mar 2018
    • Amazon Web Services Solutions Architect Associate

      Amazon Web Services (AWS)
      Jun 2020
    • UiPath Foundation Level

      UiPath
    • Tableau 10 for Data Science

      Udemy
      Apr 2020
    • Exam Prep: Microsoft Azure Fundamentals (AZ-900)

      LinkedIn
      Jul 2020
    • ISTQB Foundation Level

      ISTQB - International Software Testing Qualifications Board
      Mar 2011
    • PRINCE2® Practitioner

      AXELOS Global Best Practice
      Mar 2018
    • Certified Scrum Product Owner (CSPO)

      Scrum Alliance
      May 2020
    • Certified Scrum Master (CSM)

      Scrum Alliance
      May 2018
  • Honors & Awards

    • Best Employee Award

      Google
      Mar 2014
  • Volunteer Experience

    • Academy Trainer

      Cognizant
      May 2013