I am a highly energetic, ambitious person who has developed a mature and responsible approach to any task I undertake or situation I am presented with. Capable of learning new technologies and using that knowledge to build dependable solutions in critical situations. Good leadership and teamwork qualities.
Working on web scraper development and configuration using Scrapy and Selenium
Preparing various APIs for analyzing daily and historical data using Django REST Framework
Working on historical data migration using pandas
Worked on configuring the Odoo structure according to business models
Prepared various connectors to sync data between Odoo, other ERP systems, and cloud platforms such as Magento, Shopify, Amazon, Google Drive, DocuSign, RightSignature, etc.
Communication with clients from requirement gathering to deployment
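As an illustration of the scraping work above, the extraction step can be sketched with the standard library's html.parser (the production scrapers used Scrapy and Selenium; the markup, class names, and field names here are hypothetical):

```python
from html.parser import HTMLParser


class HeadlineParser(HTMLParser):
    """Collects {title, url} items from <a class="headline"> tags.

    A stand-in for a Scrapy selector; the 'headline' class and the
    markup fed to it are illustrative only.
    """

    def __init__(self):
        super().__init__()
        self.items = []
        self._href = None  # href of the anchor currently being read

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "headline" in (attrs.get("class") or ""):
            self._href = attrs.get("href")

    def handle_data(self, data):
        # Text inside a matching anchor becomes the item title.
        if self._href is not None and data.strip():
            self.items.append({"title": data.strip(), "url": self._href})
            self._href = None

    def handle_endtag(self, tag):
        if tag == "a":
            self._href = None


parser = HeadlineParser()
parser.feed('<a class="headline" href="/news/1">First story</a>')
```

In a real Scrapy spider this logic collapses to a CSS selector in the spider's `parse` callback; the sketch only shows the extraction idea in self-contained form.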
Company Profile:
QC Technologies is a service-based company providing Python, Django, Flask, and JavaScript services. QC Technologies positions itself as a software house delivering quality products while meeting the best professional standards and commitments, without compromising its values and its team's well-being. The company has experience in web applications, automated software testing, back-end server processes, API development, and applications integrated with third-party services. The company has helped various startups in the US.
Key Responsibilities:
Working as a full-stack developer on Innovations
Preparing design documents containing ERDs and technical details of possible solutions
Minimizing bugs through a TDD (Test-Driven Development) approach
Communication with clients from requirement gathering to deployment
Implementation of Django Channels in Innovations
Worked on many new features in Innovations
Working as a developer at TechScope on the pakchinanews.pk project (http://pakchinanews.pk)
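The TDD approach mentioned above can be illustrated with a minimal red-green cycle using Python's built-in unittest; the function under test is a hypothetical helper, not code from Innovations:

```python
import unittest


def slugify(title: str) -> str:
    """Turn an article title into a URL slug (illustrative helper only)."""
    # Lowercase, replace every non-alphanumeric character with a space,
    # then join the remaining words with hyphens.
    cleaned = "".join(ch if ch.isalnum() else " " for ch in title.lower())
    return "-".join(cleaned.split())


class SlugifyTest(unittest.TestCase):
    """In TDD these tests are written first; the implementation above
    is then written to make them pass."""

    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_strips_punctuation(self):
        self.assertEqual(slugify("Django & Channels!"), "django-channels")
```

The suite would be run with `python -m unittest` before and after each change, keeping the failing test as the specification until the code satisfies it.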
Developed a research tool, Literature Review Assistant:
A systematic literature review assistant: a powerful web-based tool for researchers to find research papers as supporting material, follow other researchers and get their updates, save searches, filter searches per user requirements, and visualize search results as graphs.
Includes an intelligent filtering algorithm, a tagging mechanism, a multi-threaded server capable of handling remote and local clients, and an analytical tool capable of performing statistical analysis on search queries and presenting the data in graphical form.
Backend web crawlers designed to crawl the digital repositories dynamically, both for temporal increments and as a whole.
Modules: Website (frontend), Web API, Web Crawler
Technologies: ASP.NET MVC, MS SQL, AJAX, JavaScript, Selenium, R
Tools: Visual Studio 2013, SQL Server 2012, Zotero
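The tagging and filtering mechanism described above can be sketched as a set-containment check; the project itself was built on ASP.NET, so this Python sketch, with its hypothetical paper data and tag names, only conveys the idea:

```python
def filter_by_tags(papers, required_tags):
    """Return the papers whose tag set contains every required tag."""
    required = set(required_tags)
    # Set containment: a paper matches only if it carries all required tags.
    return [p for p in papers if required <= set(p["tags"])]


# Hypothetical sample data for illustration.
papers = [
    {"title": "Survey of SLR tools", "tags": {"slr", "survey"}},
    {"title": "Crawling digital libraries", "tags": {"crawler"}},
]

# Papers tagged with both 'slr' and 'survey'.
matching = filter_by_tags(papers, ["slr", "survey"])
```

In the actual tool this filtering ran server-side over the crawled repository index, with the results feeding the graphical analytics views.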