I am a tech-savvy professional with a passion for programming and analytics and an eagerness to work with the latest technologies. I am deeply committed to exploring opportunities in core software development and artificial intelligence, and I have extensive hands-on experience with various Unix operating systems. I excel at processing and optimizing results through data-transformation pipelines and have a strong understanding of a range of Business Intelligence tools.
• Conduct analysis of the existing Oracle database structure to determine migration requirements and plan the data migration to PostgreSQL
• Write Perl scripts to extract data from Oracle databases and prepare it for migration to PostgreSQL
• Develop, implement, and maintain data warehousing solutions to store and analyze large amounts of data
• Utilize BI tools and technologies to create dashboards, reports, and visualizations that provide valuable insights and inform data-driven decision-making
• Design, develop, and maintain Informatica workflows and mappings to extract, transform, and load data
• Collaborate with stakeholders to understand their data needs and design effective data solutions
• Debug and troubleshoot Informatica errors and resolve performance issues
• Evaluate and implement new Informatica technologies and tools to improve data pipeline functionality and reliability
• Design, develop, and implement data pipeline architecture to extract data from SAP systems and load it into Apache Superset
• Map SAP data fields to Apache Superset data fields and design data transformation logic to ensure data is properly converted
• Develop and maintain scripts and tools to automate data migration and ensure the accuracy and consistency of data
• Develop custom solutions to meet the specific needs and requirements of the organization
• Collaborate with software engineers to integrate scripts and migration solutions into the overall application architecture
• Evaluate new technologies and tools that can improve the functionality and reliability of the migration process
• Monitor migration performance and optimize data pipelines as needed to ensure data is loaded in a timely and efficient manner
• Write detailed documentation on the migration process and provide training and support to stakeholders as needed
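The field-mapping and conversion step of such a migration can be sketched as follows. This is a minimal illustration in Python (the extraction scripts above were Perl); the column names, date format, and type rules are hypothetical placeholders for whatever the schema analysis actually produces.

```python
# Sketch of the Oracle -> PostgreSQL row-transformation step (hypothetical
# column names and type rules; the real mapping comes from the schema analysis).
from datetime import datetime

# Assumed mapping: Oracle column name -> PostgreSQL column name
COLUMN_MAP = {"EMP_ID": "employee_id", "HIRE_DT": "hire_date", "SAL": "salary"}

def transform_row(oracle_row: dict) -> dict:
    """Rename columns and normalize values for the PostgreSQL load."""
    out = {}
    for src, dst in COLUMN_MAP.items():
        value = oracle_row[src]
        if dst == "hire_date":
            # Oracle DATE exported as 'DD-Mon-YYYY'; PostgreSQL expects ISO 8601.
            value = datetime.strptime(value, "%d-%b-%Y").date().isoformat()
        elif dst == "salary":
            # Oracle NUMBER exported as a string; cast for a NUMERIC column.
            value = float(value)
        out[dst] = value
    return out

row = {"EMP_ID": 7, "HIRE_DT": "03-Jan-2020", "SAL": "52000.50"}
print(transform_row(row))
# {'employee_id': 7, 'hire_date': '2020-01-03', 'salary': 52000.5}
```

Keeping the mapping in a single dictionary makes it easy to regenerate from the database catalog and to validate field-by-field before the bulk load.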
I have extensive experience in Python automation development, excellent coding skills, and a keen eye for detail and efficiency in design and code.
Write reusable, testable, and efficient code using object-oriented methodologies.
Build scalable and reliable data systems consisting of multiple modules and sub-modules.
Integrate back-end Python-based modules with user-interface elements developed by front-end developers.
Enhance Python pipelines to capture key elements from client systems and automate verification of algorithm findings.
Design, build, test, and deploy services and APIs that are integral to our products.
Evaluate and drive new opportunities through expert comprehension of software design, APIs, and web technologies.
Integrate with front-end technologies such as PHP and Laravel.
I have extensive working experience as a Data Analyst / AI Engineer, developing highly scalable Data Science and Machine Learning/Deep Learning-based applications and services.
Perform end-to-end machine learning model deployment on the cloud (AWS, Azure, Heroku). Expertise in validating data using EDA/ETL techniques: central tendency, dispersion, quartiles/percentiles, standardization, and data visualization.
Proficient in understanding, analyzing, and visualizing data from various domains, building best-fit models based on the data, and providing appropriate insights into business problems. Excellent problem-solving skills, with attention to detail and a focus on quality.
Solid understanding of the mathematical foundations behind machine learning algorithms, and comfortable discussing them in detail. Solid programming skills in C and Python and its libraries, including NumPy, Pandas, Matplotlib, Seaborn, Scikit-Learn, Keras, TensorFlow, OpenCV, etc.
Experience implementing multi-purpose machine learning algorithms
Passionate about working on large datasets and able to design high-value-added solutions for business stakeholders, coordinating well with internal teams and clients using agile methodologies.
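The EDA checks named above (central tendency, dispersion, quartiles/percentiles, standardization) can be sketched with NumPy on a small sample array; the data values here are illustrative only.

```python
# Minimal sketch of the descriptive-statistics validation mentioned above,
# run on a small illustrative sample.
import numpy as np

data = np.array([12.0, 15.0, 14.0, 10.0, 18.0, 20.0, 16.0, 14.0])

mean = data.mean()                                # central tendency
median = np.median(data)
std = data.std(ddof=0)                            # dispersion (population std)
q1, q2, q3 = np.percentile(data, [25, 50, 75])    # quartiles
zscores = (data - mean) / std                     # standardization (z-scores)

print(mean, median)   # 14.875 14.5
print(q1, q2, q3)     # 13.5 14.5 16.5
```

Standardized values have zero mean by construction, which makes them a convenient sanity check before feeding features into distance- or gradient-based models.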
Develop tools, infrastructure, and frameworks to increase the team's automation productivity and add quality to our products.
Create automated functional, regression, and stress test infrastructure for game, web, and platform features.
Help educate and provide leadership on how to leverage existing tools and build custom automated testing solutions.
Identify, troubleshoot, and resolve issues by working with the entire development team, including adding instrumentation and performing debugger-based analysis.
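A regression-test harness of the kind described above can be sketched with the standard-library `unittest` module; the function under test and the cases are illustrative placeholders, not any particular product's rules.

```python
# Sketch of a tiny table-driven regression harness using unittest:
# adding a new regression case is one row in CASES.
import unittest

def apply_damage(health: int, damage: int) -> int:
    """Illustrative game rule under test: health never drops below zero."""
    return max(0, health - damage)

class RegressionSuite(unittest.TestCase):
    CASES = [
        (100, 30, 70),   # normal hit
        (20, 50, 0),     # overkill clamps to zero
        (0, 10, 0),      # already at zero stays at zero
    ]

    def test_apply_damage_regressions(self):
        for health, damage, expected in self.CASES:
            # subTest reports each failing row separately instead of
            # stopping at the first failure.
            with self.subTest(health=health, damage=damage):
                self.assertEqual(apply_damage(health, damage), expected)

# Run the suite programmatically (e.g. from a CI wrapper script).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(RegressionSuite)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("passed:", result.wasSuccessful())
```

The same table-driven shape extends naturally to functional and stress suites: the loop stays fixed while the case table grows.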