Karthik Ganapathy

Project Lead

169 followers
Chennai, Tamil Nadu, India

  • About me

    Data Architect - Technical Advisor at Fiserv

  • Education

    • Bharathidasan University

      1998 - 2002
      Bachelor of Engineering, Computer Science
  • Experience

    • HCL Technologies

      Oct 2003 - Jun 2011
      Project Lead

      - Analyzed requirements with business user groups, wrote functional specifications, and translated the business requirements; documented source-to-target mappings and ETL specifications.
      - Led the development team and coordinated projects among developers.
      - Created and maintained database objects (tables, views, indexes, etc.); well versed in constructing DB2 SQL queries based on business requirements.
      - Analyzed ETL system impact and designed ETL processes for optimal performance.
      - Developed Informatica mappings and workflows for extracts and feeds; data is extracted through mappings from sources such as relational database tables and flat / VSAM files.
      - Developed workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored results using Workflow Monitor.
      - Created parameter files to pass database connections and parameter entries for source and target.
      - Applied slowly changing dimensions (Type 1 and Type 2) to handle delta loads.
      - Worked on various lookup caches: static, dynamic, and persistent.
      - Used the Informatica debugger to test and fix mappings.
      - Worked on command, event-wait, event-raise, and timer tasks to implement business logic.
      - Implemented target load plans and pre-session and post-session scripts.
      - Modified several existing mappings based on user requirements and maintained existing mappings, sessions, and workflows.
      - Created test cases for the developed mappings so testers could verify the results.
      - Trained on building conceptual, logical, and physical data models using Erwin Data Modeler.
      - Responsible for creating data models by interacting with the internal data architect, business analysts, and business users to ensure alignment with business requirements.
      - Analyzed different data types and their sources (data analysis) to ensure the created data model is extendable for future releases.
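
      The Type 2 slowly-changing-dimension handling mentioned above can be sketched in pandas. This is a minimal illustration, not the actual Informatica mappings; the column names (`cust_id`, `city`, `eff_date`, `end_date`, `current_flag`) are assumed for the example.

      ```python
      import pandas as pd

      def apply_scd_type2(dim: pd.DataFrame, delta: pd.DataFrame, today: str) -> pd.DataFrame:
          """SCD Type 2 delta load: expire changed rows, insert new versions.

          Mutates `dim` in place to expire old versions, then returns the
          dimension with the freshly inserted current rows appended.
          """
          current = dim[dim["current_flag"] == "Y"]
          merged = current.merge(delta, on="cust_id", suffixes=("", "_new"))
          changed_ids = merged.loc[merged["city"] != merged["city_new"], "cust_id"]

          # Expire the currently active versions of rows whose attributes changed.
          expire = dim["cust_id"].isin(changed_ids) & (dim["current_flag"] == "Y")
          dim.loc[expire, ["current_flag", "end_date"]] = ["N", today]

          # Insert fresh versions for changed keys and brand-new keys.
          new_ids = set(changed_ids) | (set(delta["cust_id"]) - set(dim["cust_id"]))
          inserts = delta[delta["cust_id"].isin(new_ids)].copy()
          inserts["eff_date"], inserts["end_date"], inserts["current_flag"] = today, None, "Y"
          return pd.concat([dim, inserts], ignore_index=True)
      ```

      A Type 1 load would instead overwrite the attribute in place, keeping no history; Type 2 keeps every version with effective-date ranges and a current-row flag.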

    • Tata Consultancy Services

      Jul 2011 - Oct 2015
      Associate Consultant

      - Key person driving a large migration project; collaborated with a team of developers to deliver results.
      - Achieved 95% straight-through processing and migrated data from the old to the new system with good performance.
      - Self-driven; took proactive steps to correct data anomalies such as duplicate, inaccurate, and irrelevant feeds.
      - Studied existing business processes and organizational procedures and transformed them into technical solutions.
      - Developed Informatica mappings and workflows for extracts and feeds; data is extracted through mappings from sources such as relational database tables and flat files.
      - Extracted and transformed data using various Informatica transformations and loaded it to the target.
      - Responsible for providing technical development and support for a major end-to-end ETL migration rollout.
      - Participated in requirement walkthrough sessions, provided technical support to ETL developers, and monitored and tracked progress on a regular basis.
      - Wrote complex SQL queries to extract data and submitted reports to the business for validation.
      - Took part in ETL mapping reviews, suggested best practices, and ensured the quality of ETL mapping deliverables.
      - Consulted with the client on ETL process design for better maintainability and presented pros and cons for different scenarios.
      - Proposed and made quick fixes for production issues to ensure smooth functioning of the application.
      - Documented the handling of ETL restart scenarios for various business processes to recover from failures.
      - Improved the performance of several PowerCenter mappings that had been running in production for long hours, drastically reducing the time taken to run the entire batch.
      - Catered for urgent business change requests after go-live and ensured stakeholders were informed of the changes.
      - Proactively prepared a list of commonly occurring issues and took steps to correct them; performed continuous improvements to maintain good data quality at all times.
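
      The feed-correction step described above (duplicate, inaccurate, and irrelevant records) can be sketched in pandas. The column names (`feed_id`, `amount`, `load_ts`) and the filter rules are illustrative assumptions, not the project's actual validation logic.

      ```python
      import pandas as pd

      def clean_feed(feed: pd.DataFrame) -> pd.DataFrame:
          """Remove anomalous rows from a staging feed before loading.

          Drops rows with a missing business key (inaccurate), filters out
          non-positive amounts (irrelevant), and keeps only the latest row
          per `feed_id` when duplicates arrive (duplicates).
          """
          feed = feed.dropna(subset=["feed_id"])            # inaccurate: key is missing
          feed = feed[feed["amount"] > 0]                   # irrelevant: non-positive amount
          feed = feed.sort_values("load_ts").drop_duplicates("feed_id", keep="last")
          return feed.reset_index(drop=True)
      ```

      Running a check like this before the main ETL keeps bad rows out of the target and is one way to push straight-through-processing rates up.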

    • Fiserv

      Oct 2015 - present

      - Creates ETL designs and handles end-to-end development using Informatica PowerCenter 9.x / Informatica Cloud, catering to various business intelligence requirements of First Data Brazil.
      - Developed end-to-end mappings and workflows to integrate data from various sources (flat files, Oracle database) into a target DB2 system with good performance.
      - Proficient in Informatica administration activities: created a new data warehouse Informatica environment with Integration Services on grid and was involved in setting up users and access.
      - Used Informatica Cloud REST APIs and performed loading into Amazon Redshift through AWS S3; good knowledge of AWS EC2, S3, and Redshift.
      - Creates Python scripts using packages such as NumPy, Matplotlib, and Pandas; schedules jobs using Control-M.
      - Worked on Informatica Data Quality using the Informatica Developer tool; used Address Doctor to validate addresses.
      - Involved in setting up a new enterprise data warehouse / Informatica infrastructure when a new financial institution was added to the Brazil acquiring platform.
      - Studied existing business processes and organizational procedures and transformed them into technical solutions.
      - Analyzed and wrote detailed descriptions of user needs and documented the steps required to develop BI applications.
      - Implemented hybrid data warehouse methodologies: Data Vault methodology (hubs, links, and satellites) for the core/history layer and dimensional modelling for the presentation layer.
      - Participated in requirement walkthroughs and joint application design calls to understand and brainstorm different approaches with their pros and cons.
      - Developed SQL and Informatica methodologies to process huge data volumes within realistic time frames.
      - Improved the performance of several PowerCenter mappings that had been running in production for long hours, drastically reducing the time taken to run the entire batch.
      - Provided implementation and technical support for UAT and production deployment.
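
      In the Data Vault methodology mentioned above, hubs store business keys under a deterministic hash key, and loads are insert-only. A minimal Python sketch of that idea follows; the normalization convention, MD5 choice, and table layout are illustrative assumptions, not the actual warehouse design.

      ```python
      import hashlib

      def hub_hash_key(*business_key_parts: str) -> str:
          """Derive a deterministic hub hash key from the business key parts."""
          # Normalize (trim, uppercase) so the same key always hashes identically.
          normalized = "||".join(p.strip().upper() for p in business_key_parts)
          return hashlib.md5(normalized.encode("utf-8")).hexdigest()

      def load_hub(hub: dict, business_key: str, record_source: str, load_date: str) -> dict:
          """Insert-only hub load: add the key only if it is not already present."""
          hk = hub_hash_key(business_key)
          hub.setdefault(hk, {"business_key": business_key,
                              "record_source": record_source,
                              "load_date": load_date})
          return hub
      ```

      Because the hash is derived purely from the business key, hub, link, and satellite loads can run in parallel without sequence-generator lookups, which is one reason the pattern suits a history layer; descriptive attributes and their changes over time go into satellites keyed by the same hash.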

      • Data Architect

        Nov 2020 - present
      • Business Intelligence - Manager IT

        Oct 2015 - Oct 2020
  • Licenses & Certifications

    • Big Data Foundations - Level 1

      IBM
    • IBM Certified DB2 Fundamentals

      IBM
    • Quest Badge - AWS Solution Architect Associate

      Qwiklabs
      Sep 2017
    • Python Programmer

      DataCamp
      Apr 2018
    • IBM Certified DB2 Database Application Developer

      IBM
      Apr 2014
    • Informatica Cloud Certified 201 badge

      Informatica
      Aug 2017