Job Description
Introduction
Join our dynamic team to design and develop innovative methods and systems that consolidate and analyze diverse "big data" sources. Your work will generate actionable insights and solutions for client services and product enhancement. You will collaborate with product and service teams to identify questions and issues for data analysis and experiments, and develop cutting-edge software, algorithms, and automated processes.
Required Skills & Qualifications
• Applicants must be able to work directly for us on a W2 basis
• Expert-level SQL skills
• Proficiency in Python (standard libraries)
• Experience with Apache Spark
• Familiarity with AWS and Databricks
• Experience in developing data governance standards
• Ability to evaluate and utilize new technologies, tools, and frameworks
• Strong problem-solving skills and the initiative to explore and solve complex issues
Day-to-Day Responsibilities
• Lead initiatives that advance key goals across consumer and commercial analytics functions
• Work with stakeholders to understand requirements and develop sustainable data solutions
• Document and communicate systems and analytics changes to the business
Project Overview
Client’s Marketplace Coverage Correction Factors (MCCF) product is a data science solution designed to estimate total marketplace sales at a detailed product level, particularly in areas where Client does not have direct access to retailer point-of-sale (POS) data.
The MCCF product leverages advanced modeling to “gross up” known sales data from mapped accounts and predict sales for unmapped accounts, helping Client gain a comprehensive view of marketplace performance.
This project is essential for supporting business decision-making and optimizing Client’s marketplace strategy.
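In spirit, the "gross up" step scales sales observed from mapped accounts by the inverse of their estimated marketplace coverage. The following is a minimal illustrative sketch of that idea only; the function name, parameters, and figures are hypothetical and do not represent Client's actual MCCF model.

```python
# Illustrative sketch of a coverage "gross up" correction factor.
# All names and numbers are hypothetical, not Client's actual model.

def gross_up(mapped_sales: float, coverage_rate: float) -> float:
    """Estimate total marketplace sales from mapped-account sales.

    coverage_rate is the estimated share of the marketplace covered by
    mapped (POS-reporting) accounts, e.g. 0.8 for 80% coverage.
    """
    if not 0 < coverage_rate <= 1:
        raise ValueError("coverage_rate must be in (0, 1]")
    correction_factor = 1.0 / coverage_rate  # the "MCCF"-style multiplier
    return mapped_sales * correction_factor

# Example: $400k observed from accounts covering ~80% of the market
total = gross_up(400_000, 0.8)  # -> 500000.0
```

In practice the coverage rate itself would come from predictive modeling of unmapped accounts rather than a fixed constant, which is where the advanced modeling described above comes in.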
• Contribute to the accomplishment of key goals across consumer and commercial analytics functions
• Work with key stakeholders to understand requirements, develop sustainable data solutions, and provide insights and recommendations
• Use data to predict trends and perform statistical analysis
• Monitor data quality and remove corrupt data
• Evaluate and utilize new technologies, tools, and frameworks centered around high-volume data processing
• Validate key performance indicators and build queries to measure business performance
• Develop SQL queries and data visualizations for ad-hoc analysis requests and ongoing reporting needs
• Design and build innovative data and analytics solutions to support key decisions
Company Benefits & Culture
We are committed to fostering a diverse and inclusive work environment, encouraging continuous learning and development, and promoting a collaborative and innovative culture.
For immediate consideration, please click APPLY to begin the screening process.
Applying Instructions
• CareerBuilder