Discovery Track
Understanding the problem space, identifying opportunities, and defining the product vision, drawing on the perspectives of employees, users, and customers as well as supporting metrics.
Identifying the Problem
- Understand pain points from all perspectives: employees (internal processes), users (actual product usage), and customers (needs and desires).
- Conduct user interviews with employees and customers to gather feedback.
Tools: SurveyMonkey (gather feedback from customers and users), UserInterviews (user research and testing), Hotjar (analyzing user behavior on websites and apps)
Conducting Market Research
- Analyze key market metrics, trends, and customer behavior.
- Perform competitor analysis and review industry reports.
- Examine employee efficiency metrics and how the product can improve them.
Tools: Crunchbase (competitor analysis), CB Insights (market trend and competitor intelligence), Gartner (industry reports and market insights)
Validating Ideas
- Collect feedback through testing, focusing on user experience (UX) and input from internal stakeholders.
- Use metrics to validate the feasibility of ideas.
Tools: Lookback.io (user testing and feedback), InVision (prototyping and gathering feedback), Miro (brainstorming with employees and teams)
Defining Product Vision
- Align the product vision with business goals, ensuring it addresses user needs, employee pain points, and customer desires.
- Use metrics such as NPS (Net Promoter Score), engagement data, and employee feedback to inform the product vision.
Tools: Miro (collaborative vision mapping), Aha! (product vision and strategic planning), Trello (aligning team members around product goals)
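The NPS mentioned above is a simple formula: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6) among survey respondents. A minimal sketch, with illustrative sample responses:

```python
# NPS from 0-10 survey responses: % promoters (9-10) minus % detractors
# (0-6), giving a score between -100 and 100. Sample data is illustrative.
def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

responses = [10, 9, 9, 8, 7, 6, 3, 10, 8, 5]
nps = net_promoter_score(responses)  # 4 promoters, 3 detractors -> 10
```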
Delivery Track
Turning the product concept into a working solution and launching it to the market.
Defining Roadmap
- Create a prioritized product roadmap based on business goals, customer feedback, and development capacity.
- Ensure alignment with company objectives and key milestones.
Tools: Jira (managing product roadmaps and sprints), Aha! (product roadmap planning and feature prioritization), Trello (task management and roadmap visualization)
Working with Development Teams
- Collaborate with engineering, UX/UI, and design teams to build the product iteratively.
- Ensure that the development team has all the necessary requirements to meet the product's vision.
Tools: Slack (real-time communication and coordination), Confluence (documentation and collaborative knowledge sharing), Figma (design collaboration and prototyping with teams)
Managing Releases
- Oversee the product release cycle, ensuring timely and high-quality delivery.
- Coordinate with QA, operations, and deployment teams to ensure smooth releases.
Tools: GitHub (version control and release management), GitLab (CI/CD pipelines and release management), Jenkins (continuous integration and automated testing)
Ensuring Quality
- Maintain a focus on product quality, ensuring it meets user expectations and business objectives.
- Work closely with QA teams to perform thorough testing and identify any issues before launch.
Tools: TestRail (managing test cases and tracking results), Selenium (automated testing), Bugzilla (bug tracking and issue management)
Data Track
Covers the complete lifecycle from raw data sourcing to AI model deployment and performance monitoring.
Data Sourcing & Collection
- Define the data needed to support the product or model.
- Choose sources like internal systems, open data, or third-party providers.
- Check that the data is accurate, complete, and legally compliant.
- Set up infrastructure to access and collect data efficiently.
- Assign ownership and document each dataset properly.
Tools: SQL, Snowflake, BigQuery (query and access structured internal data), Airtable, Google Sheets (organize and collaborate on small datasets), AWS S3 (store and manage large-scale data securely), APIs, Web Scrapers (collect data from external platforms and websites)
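A minimal sketch of querying structured internal data with SQL, one of the tools listed above. Here sqlite3 stands in for a warehouse such as Snowflake or BigQuery, and the table and column names are illustrative:

```python
# Collecting structured internal data with plain SQL. An in-memory
# sqlite3 database stands in for a real warehouse; in practice you would
# connect to Snowflake/BigQuery and document the dataset's owner/source.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "signup", "2024-01-01"), ("u1", "purchase", "2024-01-03"),
     ("u2", "signup", "2024-01-02")],
)
# Pull one well-defined dataset per documented question.
rows = conn.execute(
    "SELECT event, COUNT(*) AS n FROM events GROUP BY event ORDER BY event"
).fetchall()
```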
Data Preparation & Feature Engineering
- Clean, transform, and structure data, and extract features for model input.
- Select features, split data into training/validation/test sets, and maintain versioning and documentation.
- Store unstructured data in a data lake or warehouse for future use.
- Partner with data scientists (DS) and data engineers to unblock data quality or access issues.
Tools: Pandas, SQLPlus (data wrangling), Jupyter Notebooks (exploration and transformation)
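The cleaning and feature-extraction steps above can be sketched with Pandas (one of the tools listed); the column names and derived features are hypothetical, not a prescribed schema:

```python
# Sketch of data preparation with pandas: clean raw records, then derive
# simple model-ready features. Column names ("churned", "monthly_spend",
# "logins", "plan") are illustrative; adapt to your own schema.
import pandas as pd

def prepare_features(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()
    # Cleaning: drop rows missing the target, fill gaps in numeric inputs.
    df = df.dropna(subset=["churned"])
    df["monthly_spend"] = df["monthly_spend"].fillna(df["monthly_spend"].median())
    # Feature engineering: derive a ratio and encode a categorical column.
    df["spend_per_login"] = df["monthly_spend"] / df["logins"].clip(lower=1)
    df["plan_code"] = df["plan"].astype("category").cat.codes
    return df[["monthly_spend", "logins", "spend_per_login", "plan_code", "churned"]]

raw = pd.DataFrame({
    "monthly_spend": [20.0, None, 45.0, 10.0],
    "logins": [4, 0, 9, 2],
    "plan": ["basic", "pro", "pro", "basic"],
    "churned": [0, 1, 0, None],
})
features = prepare_features(raw)
```

In a real pipeline this step would also write the prepared dataset to versioned storage so each model run can be traced back to its exact inputs.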
Model Development, Evaluation & Launch
- Define modeling goals in collaboration with data scientists to align on what the model should predict, recommend, or classify.
- Set success criteria and acceptable performance thresholds based on business impact (e.g., prioritize precision over recall).
- Train models using appropriate algorithms and evaluate them using the right metrics such as accuracy, precision, recall, F1 score, and fairness.
- Package the approved model, deploy it via batch or real-time systems, integrate with products, and set up version control and monitoring with DevOps.
Tools: Jupyter Notebooks (experimentation review & comparative analysis), Excel/Google Sheets (simplified validation reporting), Docker, Kubernetes (containerize and orchestrate model deployment), AWS SageMaker (model deployment in cloud)
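The evaluation metrics named above have standard definitions over a binary confusion matrix. A minimal standard-library sketch (in practice a library such as scikit-learn would compute these):

```python
# Accuracy, precision, recall, and F1 from a binary confusion matrix.
# Guards against division by zero when a class is never predicted.
def classification_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}

m = classification_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 1, 1, 0])
```

Which metric to optimize follows from the business threshold set earlier; for example, a fraud-flagging model that triggers manual review may favor recall, while one that auto-blocks accounts may favor precision.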
Monitoring, Feedback & Optimization
- Monitor model performance and system health in production using key metrics.
- Detect data or concept drift that may impact model accuracy.
- Collect user and system feedback to identify areas of improvement.
- Plan periodic evaluations and retraining using updated data.
Tools: Grafana (track live model and system metrics), SageMaker Model Monitor (for model performance in AWS)
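One common way to detect the data drift mentioned above is the Population Stability Index (PSI), which compares a feature's live distribution against its training-time baseline. A hedged sketch; the 0.2 alert threshold is a common rule of thumb, not a universal standard:

```python
# Population Stability Index (PSI) between a training baseline and live
# feature values. Higher PSI = more drift; >= 0.2 is often treated as a
# signal to investigate (rule of thumb, not a standard).
import math

def psi(expected, actual, bins=10):
    lo, hi = min(expected), max(expected)
    def bucket_fracs(values):
        counts = [0] * bins
        for v in values:
            # Map value to a bin over the baseline's range; clamp outliers.
            idx = int((v - lo) / (hi - lo) * bins) if hi > lo else 0
            counts[min(max(idx, 0), bins - 1)] += 1
        # Small floor avoids log(0) for empty buckets.
        return [max(c / len(values), 1e-6) for c in counts]
    e, a = bucket_fracs(expected), bucket_fracs(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [0.1 * i for i in range(100)]   # training-time distribution
live_shifted = [v + 5 for v in baseline]   # shifted live distribution
```

In production this check would run per feature on a schedule, with the resulting PSI values exported to a dashboard (e.g., Grafana) and alerting on sustained breaches.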
