Job Description: The candidate must be able to design and build systems for collecting, storing, and analysing data at scale, and to implement methods that improve data reliability and quality, transforming data into knowledge. They should be well versed in data mining techniques, data exploration, data modelling, training and validation, and model deployment.
Experience Level: 2-5 years
Technical Skills Required
- Programming languages: Python, SQL, Java; R (optional), SAS (optional)
- Knowledge of data science and machine learning
- Knowledge of statistical modelling
- Strong knowledge of data modelling, data mining, and segmentation techniques
- MLOps / model management and deployment, in the cloud or on-premises
- Experience with end-to-end implementation of data science projects, both in the cloud and on-premises
- Knowledge of Apache Spark / Apache Kafka for big data processing.
- Knowledge of Time Series Forecasting, NLP, Computer Vision, Transfer Learning, and Reinforcement Learning is a plus
- Experience using cloud services (AWS, Google Cloud, or Azure) and cloud concepts (IaaS, PaaS, and SaaS)
- Good understanding of core database concepts and familiarity with SQL and NoSQL databases (e.g., MySQL, MongoDB)
- Awareness of data warehouse and ETL concepts.
- Experience with data science tools such as SAS, Dataiku, Azure ML Studio, Amazon SageMaker, DataRobot, or KNIME Analytics is an added advantage
- Comfortable working as an individual contributor as well as a team player
- Strong problem-solving and analytical skills
Nature of work:
- Work on data analytics professional-services projects and POCs based on business requirements
- Provide data preparation, data modelling, and deployment solutions with the Altair portfolio
- Build applications that collect, manage, and convert raw data into usable information for data modelling, data visualization, and real-time analytics
To view the job application, please visit phh.tbe.taleo.net.