Arunachalam Sivananthan

Developer & Design lead

Chennai, Tamil Nadu, India

  • About me

    Senior Manager at Databricks

  • Education

    • Raja Rajeswari Engineering College

      1999 - 2003
      Bachelor’s Degree in Computer Science, First Class
    • Vivekananda Vidyalaya College of Education

      1992 - 1997
      Schooling
  • Experience

    • Coromandel Infotech India Limited

      Sept 2005 - Dec 2008
      Developer & Design lead

      Hatton National Bank is one of the leading banks in Sri Lanka, with more than 120 branches. This project covered the preparation of menus for generating reports related to Trade Finance. The banking solution was delivered through FINACLE, the core banking software from Infosys, for which Coromandel Infotech is a value partner. BOB (Bank of Baroda) is the fifth-largest bank in India, with a network of more than 2,800 branches globally; BOB uses Finacle for its accounts and operations, and new reports were added to the application as part of this engagement.

      Tools used: Oracle SQL, PL/SQL, shell scripting.

      Roles & Responsibilities:
      • As onsite coordinator for the Sri Lanka and Uganda projects, delivered and coordinated tasks between the onsite and offshore teams.
      • Involved in principal discussions to study the business functionality and gather requirements for banking reports.
      • Involved in the development and customization of banking reports.
      • Customized reports to the bank's requirements using SQL, PL/SQL and MRT, an Infosys tool that combines UNIX and Oracle (a minimal sketch of such a report extract follows below).
      • Customized new bank requirements using Scripting, a tool provided by Infosys that combines UNIX and Oracle, and tested the results with the bankers to confirm the expected outputs.
      • Led the team customizing the “Regulatory Reports” required by Bank of Baroda, Uganda: analysed each requirement, prepared the “Requirement Specification” documents in discussion with the bankers, obtained final sign-off on those documents, and developed the reports.
      • Provided support during UAT (User Acceptance Testing) and SIT (System Integration Testing), which requires a good understanding of the application combined with knowledge of banking concepts.
      • Responsible for the bank's Day-End and Day-Begin activities, which involve various processes requiring a good understanding of FINACLE and general banking.
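      For illustration only, a minimal sketch of a parameterised report extract of the kind described above, written in Python with cx_Oracle. The actual reports were built in PL/SQL and MRT, so the table, columns and connection details below are all hypothetical.

      # Hypothetical sketch of a parameterised Trade Finance report extract.
      # The real reports used PL/SQL and MRT (a proprietary Infosys tool);
      # cx_Oracle and every name below are assumptions for illustration.
      import cx_Oracle

      REPORT_SQL = """
          SELECT branch_code, instrument_no, amount
            FROM trade_finance_txns        -- hypothetical table
           WHERE txn_date = :run_date
           ORDER BY branch_code, instrument_no
      """

      def write_report(user, password, dsn, run_date, out_path):
          conn = cx_Oracle.connect(user, password, dsn)
          try:
              cur = conn.cursor()
              cur.execute(REPORT_SQL, run_date=run_date)
              with open(out_path, "w") as f:
                  # Fixed-width layout, as bank print formats typically require.
                  f.write(f"{'BRANCH':<10}{'INSTRUMENT':<16}{'AMOUNT':>14}\n")
                  for branch, instr, amount in cur:
                      f.write(f"{branch:<10}{instr:<16}{amount:>14.2f}\n")
          finally:
              conn.close()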

    • Cognizant

      Jan 2009 - Aug 2017

      Annual service visits (ASV) for British Gas Services customers have to happen yearly, based on their visit dates. Because multiple systems were involved, and due to replication issues, the ASV was not happening as required for customers with data issues. In this project, data was extracted from the different systems and analysed in Hadoop, and the data issues were fixed on the source systems.

      Tools used: Hadoop (HDFS, Hive), shell scripting, SAP CRM & ISU, QlikView.

      Role and Responsibilities:
      • Led a team of data analysts and coordinated with the business on requirements and solution design.
      • Implemented the extracts in Hadoop using Hive and shell scripting.
      • Created UDFs in Java where additional functionality was needed.
      • Coordinated with the support team on deployment and support.
      • Fine-tuned overall system performance for the ASV and P0 extracts and resolved performance bottlenecks effectively.
      • Accountable for deploying applications into production, from strategic design through development, and for the daily operations of the Hadoop environments.
      • Coordinated with the SAP team on fix-file requirements and automated fixing.

      British Gas customer and utility data was stored in multiple databases and periodically refreshed into a single Microsoft SQL Server database, where reporting and analysis were carried out. Once our big data platform (Hortonworks) was deployed, all customer and utility data moved to HDFS, and British Gas took the strategic decision to decommission the Microsoft SQL Server and build a reporting framework to identify and monitor incorrect data requiring fixes.

      Tools used: Hadoop (HDFS, Hive, MapReduce, Sqoop, Spark), shell scripting, Python & R.

      Role and Responsibilities:
      • As technical architect, responsible for designing the framework using big data technologies.
      • Designed the data model for the framework in Visio, using a snowflake (fact and dimension) pattern.
      • Implemented the data model with Hive scripts; all tables were created in HDFS.
      • Wrote Hive queries to identify incorrect data.
      • Built a framework (in Hive) to populate the data model so that incorrect data is monitored in a structured way, with visibility of how many records have been fixed since the last run, how many are new instances, and how many remain unfixed.
      • Created a scheduler framework in Python that reads configuration files and outputs a shell script that executes the framework (a minimal sketch of this pattern follows below).
      • Created visualizations (trend graphs and pie charts) in R, connecting to Hive.
      • Used Sqoop to transform data for fixes and export it to relational databases.
      • Used MapReduce for exception-log analysis and for ad hoc files received from the business.
      • Enhancements to the existing framework are being developed in Spark.
      • Involved in project-plan creation (governance discussions, management forums with the customer).
      • Managed and resolved queries, escalations, conflicts and issues across the onsite/offshore teams.
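      For illustration only, a minimal sketch of the scheduler pattern described above: a Python script reads a job config and emits a shell script that runs the Hive framework. The config keys, file names and hive arguments are hypothetical, not the actual implementation.

      # Hypothetical sketch: read a job config and generate a shell script
      # that executes a Hive-based framework. All names are illustrative.
      import configparser
      import os
      import stat

      def generate_runner(config_path, out_path):
          cfg = configparser.ConfigParser()
          cfg.read(config_path)
          job = cfg["job"]  # assumed section: [job] with db, hql_dir, run_date, hql_scripts

          lines = ["#!/bin/sh", "set -e"]  # stop on the first failing step
          # One hive invocation per HQL script listed in the config.
          for script in job["hql_scripts"].split(","):
              lines.append(
                  f'hive --hivevar run_date={job["run_date"]} '
                  f'--database {job["db"]} -f {job["hql_dir"]}/{script.strip()}'
              )

          with open(out_path, "w") as f:
              f.write("\n".join(lines) + "\n")
          # Make the generated runner executable.
          os.chmod(out_path, os.stat(out_path).st_mode | stat.S_IEXEC)

      if __name__ == "__main__":
          generate_runner("framework.ini", "framework_run.sh")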
      Catalyst data migration & data cleansing was one of Europe’s biggest migrations, moving from a myriad of systems, including Siebel databases and bespoke or heavily modified software, to SAP CRM & ISU.

      Tools used: SQL Server, Oracle SQL, PL/SQL, shell scripting, Pro*C.

      Role and Responsibilities:
      • Worked across multiple Scrum teams; responsible for the analysis and measurement of data in the legacy, Siebel and SAP CRM systems using SQL Server.
      • Identified and resolved data issues in migrating the existing legacy system to CRM.
      • Provided optimal solution designs for data fixes in SAP ISU & CRM.
      • Resolved data quality issues and organized and categorized data for migration across the legacy, SAP ISU, CRM and Siebel systems.
      • Validated contacts and handled exceptions before input into a bespoke CRM.
      • Prepared data submissions in accordance with the laid-down data schemas and resolved mismatched data.
      • Analysed and fixed data in the source system using Oracle SQL, PL/SQL and shell scripting.
      • Responsible for mapping data between the source and legacy systems.

      Winback processing is used to win back BGS customers who have cancelled their service contracts, by sending them Winback letters. The existing Winback processing was extended to cover Insurance and Care Fallback contracts. The Insurance Winbacks project simplified the rule configuration by providing a default cancellation code, so that a single rule applies to all Winback-enabled cancellation codes, and added a media-code lookup to the Winback Letter Cycle Event so the media code is sent directly to the pricing engine for Winback quote pricing (a small sketch of this default-rule pattern follows below).

      Tools used: Oracle 9i, PL/SQL, Unix shell scripting, Pro*C.

      Role and Responsibilities:
      • Involved in the full SDLC, including design, development, testing and enhancement of various modules in the project.
      • Responsible for gathering requirements from the client and the onsite teams.
      • Involved in issue tracking and bug fixing in the same area.
      • Conducted peer reviews during design, coding and testing.
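      For illustration only, a minimal sketch of the default-rule idea described above, written in Python rather than the PL/SQL actually used; the cancellation codes, rule fields and media codes are all hypothetical.

      # Hypothetical sketch of the Winback default-rule lookup.
      # The real implementation was PL/SQL; all codes and fields are invented.
      DEFAULT_CODE = "*"  # assumed catch-all cancellation code

      # Per-code overrides; every other Winback-enabled code falls back
      # to the single default rule.
      WINBACK_RULES = {
          DEFAULT_CODE: {"letter_cycle": "STANDARD", "media_code": "WB01"},
          "FRAUD": None,  # explicitly not Winback-enabled
      }

      def winback_rule(cancellation_code):
          """Return the rule for a code, falling back to the default rule."""
          return WINBACK_RULES.get(cancellation_code, WINBACK_RULES[DEFAULT_CODE])

      def quote_request(cancellation_code, account_id):
          """Build a pricing-engine request carrying the looked-up media code."""
          rule = winback_rule(cancellation_code)
          if rule is None:
              return None  # no Winback letter for this code
          return {"account": account_id, "media_code": rule["media_code"]}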

      • Big Data lead & Developer

        Aug 2015 - Aug 2017
      • Big Data Developer

        Nov 2013 - Aug 2017
      • Data Team lead & Analyst

        Dec 2010 - Oct 2013
      • Offshore developer & Coordinator

        Jan 2009 - Nov 2010
    • Accenture

      Aug 2017 - Aug 2019
      Associate Manager

      Skills: Big Data (Azure HDInsight), Microsoft Azure, Machine Learning, RFP Estimates, SME Support

    • Capgemini

      Aug 2019 - Feb 2021
      Senior Manager / Big Data & Azure Architect


    • Databricks

      Apr 2021 - Present
      Senior Manager
  • Licenses & Certifications

    • Oracle SQL & PL/SQL Developer certification

      Oracle
    • DP-203 - Data Engineering on Microsoft Azure

      Microsoft
      Nov 2021
    • Certified ScrumMaster

      Scrum Alliance
    • Databricks Certified Data Engineer Associate

      Databricks
      Aug 2022
    • Databricks Certified Associate Developer for Apache Spark 3.0

      Databricks
      Nov 2022
    • Databricks Certified Data Engineer Professional

      Databricks
      Jul 2024
    • Analyzing and Visualizing Data with Microsoft Power BI

      Microsoft
      May 2020
    • DP-201 - Designing an Azure Data Solution

      Microsoft
      Nov 2019
    • HDP Certified Developer

      Hortonworks
      Sept 2016
    • 70-774 - Perform Cloud Data Science with Azure Machine Learning

      Microsoft
      Jan 2019
    • DP-200 - Implementing an Azure Data Solution

      Microsoft
      Oct 2019