
Abhiram Reddy
Power BI/SQL Developer

About me
Data Engineer @ Capital Group | Actively looking for Remote C2C opportunities | Data Engineer | AWS | Azure | GCP | ETL | Python | SQL | Big Data | Snowflake | Power BI | Glue | Redshift | Data Lake | PySpark | Kafka
Education

JNTUH College of Engineering Hyderabad
2011 - 2015
Bachelor of Technology - BTech, Computer Science
Experience

Cognizant
Aug 2015 - Jul 2016
Power BI/SQL Developer
* Developed an ETL framework in SSIS for processing and monitoring flat-file imports for key clients.
* Maintained SQL scripts and complex queries for analysis and extraction.
* Developed stored procedures to generate drill-through, parameterized, tabular, and matrix reports using SSRS.
* Performed branching, merging, tagging, and release activities on the version control tool Git.
* Designed, built, and deployed BI solutions using Power BI.
* Created scripts to automate an in-house metadata explorer project using shell scripts and SQL.
* Created and loaded multiple metadata resources from the data warehouse.
* Worked with scripting languages such as Shell and Python; wrote Python scripts for pushing data from MongoDB to a SQL database.
* Created calculated columns and measures in Power BI and Excel using DAX queries, as required.
* Involved in the installation, configuration, and development of SSIS packages.
* Optimized the database by creating clustered and non-clustered indexes and indexed views.
* Used aggregation strategies for aggregating, sorting, and joining table data.
* Created and maintained Analysis Services objects such as cubes, dimensions, and measures.
* Created report models on SSAS cubes and changed default configurations on existing cubes.
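A minimal sketch of the MongoDB-to-SQL push mentioned above. The document shape, table name, and columns are hypothetical, and an in-memory SQLite database stands in for the target SQL database; a real pipeline would read documents with pymongo and write through the production driver.

```python
import sqlite3

def doc_to_row(doc):
    """Flatten a MongoDB-style document into a flat tuple for SQL insertion."""
    return (doc["_id"], doc["name"], doc.get("meta", {}).get("source", "unknown"))

def push_docs(docs, conn):
    """Insert flattened documents into the target table (idempotent on _id)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS items (id TEXT PRIMARY KEY, name TEXT, source TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO items VALUES (?, ?, ?)",
        [doc_to_row(d) for d in docs],
    )
    conn.commit()

# Example usage with an in-memory stand-in for the SQL database.
conn = sqlite3.connect(":memory:")
docs = [
    {"_id": "a1", "name": "alpha", "meta": {"source": "mongo"}},
    {"_id": "b2", "name": "beta"},
]
push_docs(docs, conn)
rows = conn.execute("SELECT id, name, source FROM items ORDER BY id").fetchall()
```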

Genpact
Aug 2016 - Apr 2018
Python Software Developer
* Worked with stakeholders, gathered requirements, and developed high-level design documents.
* Designed and developed components using Python.
* Implemented code in Python to retrieve and manipulate data.
* Used a MySQL database; implemented database access libraries and APIs to fetch the required information.
* Re-engineered various modules to implement changes and create a more efficient system.
* Developed rich UI web applications using JavaScript libraries such as jQuery UI.
* Used RESTful APIs to access data from different suppliers and to gather network traffic data from servers.
* Supported script configuration, testing, execution, deployment, and run monitoring and metering with shell programs.
* Supported the Apache Tomcat web server on the Linux platform.
* Performed debugging and unit testing.
* Developed and executed the User Acceptance Testing portion of a test plan.
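A minimal sketch of consuming a supplier REST API as described above. The payload shape (`{"items": [...], "next": ...}`) and field names are hypothetical; a literal JSON string stands in for the HTTP response a real client would fetch (e.g. with `urllib.request`).

```python
import json

def parse_supplier_payload(raw):
    """Parse one page of a supplier REST response, keeping only in-stock items.

    Returns the filtered items and the (possibly None) next-page link.
    """
    payload = json.loads(raw)
    items = [i for i in payload.get("items", []) if i.get("in_stock")]
    return items, payload.get("next")

# A literal response body stands in for the network call.
raw = json.dumps({
    "items": [
        {"sku": "A-1", "in_stock": True},
        {"sku": "B-2", "in_stock": False},
    ],
    "next": "/v1/items?page=2",
})
items, next_page = parse_supplier_payload(raw)
```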

Homesite Insurance
Aug 2018 - Jun 2020
Jr Python SQL/Data Engineer
* Configured Flume to collect data from multiple sources simultaneously, including web servers, mobile apps, and network devices.
* Developed a data platform from scratch and took part in the requirement gathering and analysis phase, documenting the business requirements.
* Designed tables in Hive and MySQL, used Sqoop to import and export databases to HDFS, and processed large datasets in structured, semi-structured, and unstructured forms.
* Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB; deployed AWS Lambda code from Amazon S3 buckets and configured a Lambda function to receive events from an S3 bucket.
* Used IAM to create roles, users, and groups and implemented MFA for additional security on the AWS account and its resources; used AWS ECS and EKS for Docker image storage and deployment.
* Contributed to ETL processes and data integration workflows using tools such as Apache Flink, Apache Airflow, and Informatica, improving data management capabilities.
* Ensured data integrity and security in all data-related activities, adhering to data governance practices and data quality frameworks.
* Developed REST APIs in Python with the Flask and Django frameworks and integrated various data sources, including Java, JDBC, RDBMS, shell scripting, spreadsheets, and text files.
* Created AWS Lambda functions in Python for deployment management, and designed and implemented public-facing websites on Amazon Web Services, integrating them with other application infrastructure.
* Used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
* Used AWS Data Pipeline for data extraction, transformation, and loading from homogeneous and heterogeneous data sources, and built various graphs for business decision-making using the Python matplotlib library.
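A minimal sketch of the S3-event-to-DynamoDB pattern described above. The table name and item shape are hypothetical; the DynamoDB client is injected so the handler can be exercised without AWS (in production it would be a `boto3` client), and a fake client records the writes.

```python
def handler(event, context=None, dynamodb=None):
    """Lambda handler: for each S3 event record, write a tracking item to DynamoDB."""
    written = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        item = {"pk": f"{bucket}/{key}", "size": record["s3"]["object"]["size"]}
        if dynamodb is not None:
            dynamodb.put_item(TableName="ingest_tracking", Item=item)
        written.append(item)
    return written

class FakeDynamo:
    """Stand-in client that records put_item calls for local testing."""
    def __init__(self):
        self.items = []
    def put_item(self, TableName, Item):
        self.items.append((TableName, Item))

# Example usage with a trimmed-down S3 event payload.
event = {"Records": [{"s3": {"bucket": {"name": "raw-data"},
                             "object": {"key": "2020/06/app.log", "size": 1024}}}]}
fake = FakeDynamo()
result = handler(event, dynamodb=fake)
```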

Mastercard
Jun 2020 - Jun 2022
Mid Level Python SQL/Data Engineer
* Built and architected multiple data pipelines and end-to-end ETL and ELT processes for data ingestion and transformation in GCP.
* Strong understanding of AWS components such as EC2 and S3.
* Implemented a continuous delivery pipeline with Docker and GitHub; worked with Google Cloud Functions in Python to load data into BigQuery for on-arrival CSV files in a GCS bucket.
* Processed and loaded bounded and unbounded data from a Google Pub/Sub topic to BigQuery using Cloud Dataflow with Python.
* Designed the data models used in data-intensive AWS Lambda applications for complex analysis and analytical reports covering end-to-end traceability, lineage, and definitions of key business elements from Aurora.
* Played a pivotal role in automating data collection, analysis, and reporting processes, reducing manual effort and enhancing productivity.
* Devised simple and complex SQL scripts to check and validate data flow in various applications.
* Performed data analysis, migration, cleansing, transformation, integration, import, and export through Python.
* Created Lambda functions to run AWS Glue jobs based on AWS S3 events.
* Used various AWS services including S3, EC2, AWS Glue, Athena, Redshift, EMR, SNS, SQS, DMS, and Kinesis.
* Extracted data from multiple source systems (S3, Redshift, RDS) and created multiple tables/databases in the Glue Catalog by creating Glue Crawlers.
* Used Lambda functions and Step Functions to trigger Glue jobs and orchestrate the data pipeline.
* Developed and deployed data pipelines in AWS and GCP; performed data engineering functions (extraction, transformation, loading, and integration) in support of enterprise data infrastructures, data warehouses, operational data stores, and master data management.
* Worked with AWS Elastic Beanstalk for fast deployment of applications developed with Java, PHP, Node.js, and Python on familiar servers such as Apache.
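A minimal sketch of the S3-event-triggered Glue job pattern described above. The job name and argument key are hypothetical; the Glue client is injected (it would be `boto3.client("glue")` in production, whose `start_job_run` call this mirrors) so the orchestration logic can be tested locally with a fake.

```python
def start_glue_on_s3_event(event, glue, job_name="csv_ingest_job"):
    """For each new S3 object, start the Glue job with the object path as an argument."""
    runs = []
    for record in event.get("Records", []):
        path = "s3://{}/{}".format(record["s3"]["bucket"]["name"],
                                   record["s3"]["object"]["key"])
        resp = glue.start_job_run(JobName=job_name, Arguments={"--input_path": path})
        runs.append(resp["JobRunId"])
    return runs

class FakeGlue:
    """Stand-in Glue client that records job-run requests for local testing."""
    def __init__(self):
        self.calls = []
    def start_job_run(self, JobName, Arguments):
        self.calls.append((JobName, Arguments))
        return {"JobRunId": "jr_%d" % len(self.calls)}

# Example usage with a trimmed-down S3 event payload.
event = {"Records": [{"s3": {"bucket": {"name": "landing"},
                             "object": {"key": "daily/2022-01-01.csv"}}}]}
glue = FakeGlue()
run_ids = start_glue_on_s3_event(event, glue)
```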

Teladoc Health
Jul 2022 - May 2024
Senior Python Data Engineer
* Installed and configured a multi-node cluster in the cloud using Amazon Web Services (AWS) on EC2.
* Implemented Confluent Kafka-based data pipelines that significantly improved data reliability, efficiency, and quality, resulting in more accurate analytics.
* Designed the data models used in data-intensive AWS Lambda applications for complex analysis and analytical reports covering end-to-end traceability, lineage, and definitions of key business elements from Aurora.
* Integrated AWS DynamoDB using AWS Lambda to store item values and back up DynamoDB Streams.
* Designed data warehouses on platforms such as AWS Redshift, Azure SQL Data Warehouse, and other high-performance platforms.
* Created task definitions specifying tasks, resource allocation (Fargate), services, and the Docker image on which the application is built, for Elastic Container Service and ALB.
* Implemented the AWS Elastic Container Service (ECS) scheduler to automate application deployment in the cloud using Docker automation techniques.
* Used AWS ECS to run Docker containers on ECS Fargate and ECS on EC2, and implemented Docker container logging with a sidecar container.
* Used Hibernate to store persistent data in a PostgreSQL database and wrote HQL to access the data.
* Developed the front end (UI) of the application using Angular 4, Bootstrap, JavaScript, jQuery, HTML5, and CSS3; handled active coordination and project documentation updates.
* Followed the Scrum Agile methodology for iterative development of the application.
* Designed and created database tables using a PostgreSQL database.
* Used AWS Glue for transformations and AWS Lambda to automate the process.
* Worked with data on AWS cloud services, i.e., EC2, S3, EMR, and DynamoDB.
* Created monitors, alarms, notifications, and logs for Lambda functions and Glue jobs using CloudWatch.
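A minimal sketch of the DynamoDB Streams backup described above. The attribute names are hypothetical, but the typed-attribute format (`{"S": ...}`, `{"N": ...}`, etc.) follows the real DynamoDB Streams record shape; in production the event would arrive at a Lambda handler and the plain dicts would be written to a backup store.

```python
def from_dynamo(attr):
    """Convert one DynamoDB-typed attribute value to a plain Python value."""
    (t, v), = attr.items()
    if t == "S":
        return v
    if t == "N":
        return float(v) if "." in v else int(v)
    if t == "BOOL":
        return v
    if t == "M":
        return {k: from_dynamo(x) for k, x in v.items()}
    if t == "L":
        return [from_dynamo(x) for x in v]
    raise ValueError(f"unsupported type: {t}")

def backup_stream_records(event):
    """Extract the new item image from each INSERT/MODIFY stream record,
    converted to plain dicts ready to be written to a backup store."""
    images = []
    for record in event.get("Records", []):
        if record["eventName"] in ("INSERT", "MODIFY"):
            image = record["dynamodb"]["NewImage"]
            images.append({k: from_dynamo(v) for k, v in image.items()})
    return images

# Example usage with a trimmed-down stream event.
event = {"Records": [{
    "eventName": "INSERT",
    "dynamodb": {"NewImage": {"pk": {"S": "visit#42"},
                              "duration": {"N": "17.5"},
                              "billed": {"BOOL": True}}},
}]}
backups = backup_stream_records(event)
```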

Capital Group
Jun 2024 - Present
Data Engineer
* Designed and implemented ETL SQL queries to efficiently ingest data from Amazon S3 into Snowflake, ensuring seamless data integration and high performance.
* Developed Terraform scripts to automate the creation and management of Snowflake tables and Snowpipes, streamlining infrastructure-as-code (IaC) practices and enhancing deployment efficiency.
* Engineered and automated data health monitoring using Snowflake stored procedures and tasks, ensuring data ingestion accuracy and timely identification of anomalies.
* Oversaw the deployment of data pipelines across dev, QA, and prod environments, ensuring consistent and successful implementations using Harness data pipelines.
* Contributed to the end-to-end development of ETL pipelines, optimizing data workflows and ensuring robust data processing from source to target.
* Crafted complex SQL queries in Snowflake to monitor data ingestion accuracy and maintain precise volume counts, supporting data integrity and operational efficiency.
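A minimal sketch of the data-health anomaly check described above, reduced to its core logic. The threshold and names are hypothetical; in Snowflake this kind of check would live in a stored procedure over ingestion volume counts, scheduled by a task.

```python
def flag_ingestion_anomalies(daily_counts, tolerance=0.5):
    """Flag days whose ingested row count deviates from the trailing average
    by more than `tolerance` (a fraction of the baseline)."""
    anomalies = []
    for i, (day, count) in enumerate(daily_counts):
        prior = [c for _, c in daily_counts[:i]]
        if not prior:
            continue  # no baseline yet for the first day
        baseline = sum(prior) / len(prior)
        if baseline and abs(count - baseline) / baseline > tolerance:
            anomalies.append(day)
    return anomalies

# Example usage: the third day's volume collapses and is flagged.
counts = [("2024-07-01", 1000), ("2024-07-02", 1040), ("2024-07-03", 120)]
flagged = flag_ingestion_anomalies(counts)
```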
Licenses & Certifications

JPMorgan Chase & Co. - Agile Job Simulation
Forage · Nov 2023
Recommendations

Titi Yekoye
Studied at Swinburne University of Technology · Ethiopia
Sneegdharoop Bhattacharyya
Data Scientist · Kolkata, West Bengal, India
Ram Ganesh B
Portfolio Head - Service Delivery | Associate General Manager · Greater Chennai Area
Juan David Valencia Díaz
Strategy | Product development | Digital transformation | Critical thinking | Problem solving | AI |... · Bogotá D.C. Metropolitan Area
Kavita Garg
Vice Principal, Author, NCLEX Trainer · Dehradun, Uttarakhand, India
Mahesh Raisinghani
ASM - Bajaj Finance Limited · Jodhpur, Rajasthan, India
Ahmed Mousa
Warehouse Manager at Al Arabia Group · Egypt
Jurni Rayne
Owner at Gritz N Wafflez · Los Angeles, California, United States
Erendira Gomez
Business Management Engineer / Strategic Financial Management · Vancouver, British Columbia, Canada
Brad Thomas
Director Chronicle Onboarding · Marana, Arizona, United States
Teresa Hentschel
Agile Coach · Schwäbisch Gmünd, Baden-Württemberg, Germany
Diana Oropeza Ayala
Telephone Operator | Hospitality | Guests | Service Attitude | Attention to Detail | Communication... · Tizayuca, Hidalgo, Mexico
Kiran Kumar
JAMF 100, 170 & 200 certified.. Apple Mac OS.. People Management.. Vendor Management…Incident Manage... · Karnataka, India
Juan Felipe Anaya Latorre
Marketing Digital | Digital Marketing | Digital Paid Media | Digital strategy | Performance Marketi... · Bogota, D.C., Capital District, Colombia
Meshal Alabdullatif
Senior Information System | Cybersecurity Student at Prince Sultan University | Skills & Community E... · Riyadh, Saudi Arabia
Ângela Oliveira
Psychologist · Porto, Porto, Portugal
Rishav Shaw
Customer Service Support Advisor at BT Group · Kolkata, West Bengal, India
Kieron Donnelly
IT Incident Manager | Major Incident Management | Service Improvement | ITIL | ITSM | SC Cleared · Warrington, England, United Kingdom
Samyak Nayak
Backend SE @ Intentwise | IIIT-BH 20' · Bengaluru, Karnataka, India
Joseph Chee
Singapore
...