
Building Skillbase Using the Build-Measure-Learn Cycle

Balaji Alwar, Research Fellow, Harvard Kennedy School, and Shireen Yacoub, M.Ed Candidate, Harvard Graduate School of Education


--

Want to learn more about the Skillbase origin story? Check out our first blog post in the series here.

--


At Skillbase, an initiative of the Project on Workforce at Harvard, we are on a mission to connect family, friends and coworkers to online learning that can help them advance their careers. Today, with millions of learning resources online, it’s hard to know where to start or which resources will lead to workplace success.


On the Product Management team at Skillbase, our goal is to build a product that helps users learn practical skills for current or future jobs. We aim to leverage the strengths of a diverse cross-Harvard team and forge a unique collaborative environment.


Together, we seek to build a tool that adds value to the broader education and workforce ecosystem. Rather than replace human coaching and support, we aim to complement the work of staff and organizations who support jobseekers in an increasingly virtual world.


Build: Create a Minimum Viable Product

We derived Objectives & Key Results (OKRs) for a Minimum Viable Product (MVP) from Skillbase’s mission to help learners make successful job transitions and advance their careers. These priorities served as our north star in building the MVP:

  • Providing access to high-quality, completely free resources.

  • Enabling easy navigation of existing resources through curation.

Prioritizing data and speed in the context of the COVID-19 crisis, our goal was to build an MVP in four months. In fact, we exceeded our goal and launched a fully functional MVP in three months. We prioritized three key results in the build:


1) The platform is accessible to a wide variety of users with different needs and preferences.

2) The platform provides an engaging user experience by prioritizing user retention.

3) The platform captures outcomes that measure impact on users over a longer period of time. Engagement with the platform improves career outcomes for the average engaged user through the attainment of skills, knowledge, or improved confidence about job transitions.

Features and Functionality:

The earliest Skillbase features in our MVP were rooted in a set of hypotheses about who our users were and what they needed. The product would need to effectively aggregate and curate free online learning resources from different third-party sources.


For instance, we hypothesized that users valued simplicity in design and navigation, so we created curated pathways with filtering mechanisms to help users find content meeting specific criteria. We also thought that some users might need help figuring out where to start, so we built questionnaires and diagnostics to understand their learning needs, recommend relevant pathways, and guide them through the site.
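To illustrate the diagnostic idea, here is a minimal sketch of how a short questionnaire might map a user's answers to a recommended starting pathway. The question keys and decision rules are hypothetical; only the pathway names come from the MVP's three content categories.

```python
# A rule-based mapping from self-reported answers to a starting pathway.
# The answer keys and rules below are illustrative, not the actual diagnostic.
def recommend_pathway(answers: dict) -> str:
    """Recommend one of the MVP's three content categories."""
    if answers.get("needs_english_for_work"):
        return "Learning English for Work"
    if answers.get("goal") == "switch_careers":
        return "Career and Job Search Skills"
    # Default: foundational digital skills are broadly useful.
    return "Digital Skills"

# Example: a user who wants to change careers and is comfortable in English.
print(recommend_pathway({"goal": "switch_careers", "needs_english_for_work": False}))
# -> Career and Job Search Skills
```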


The MVP included the following product features:

  • Curated pathways and click-outs to resources from third-party content providers like Google Applied Digital, Accenture, LinkedIn Learning, and Coursera. Content was organized into three categories: Career and Job Search Skills, Learning English for Work, and Digital Skills.

  • Filtering mechanism to identify resources based on criteria we believed our users valued, such as short duration, mobile-friendliness, offering a certification, and not requiring additional login/signup (see the sketch after this list).

  • Diagnostic questionnaire tool to help users identify learning pathways suited to their needs.

  • Translation option to render the site in Spanish or Portuguese.

  • Chat option to directly connect with the Skillbase team for support.

  • Sign-up and login options for users to keep track of content they had previously viewed.
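To make the filtering mechanism concrete, here is a minimal sketch of criteria-based filtering over an in-memory catalog. The Resource fields, function name, and sample entries are illustrative stand-ins rather than Skillbase's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    title: str
    provider: str
    duration_minutes: int
    mobile_friendly: bool
    offers_certificate: bool
    requires_signup: bool

def filter_resources(catalog, *, max_minutes=None, mobile_only=False,
                     certificate_only=False, no_signup=False):
    """Keep only the resources matching every criterion the user selected."""
    results = []
    for r in catalog:
        if max_minutes is not None and r.duration_minutes > max_minutes:
            continue
        if mobile_only and not r.mobile_friendly:
            continue
        if certificate_only and not r.offers_certificate:
            continue
        if no_signup and r.requires_signup:
            continue
        results.append(r)
    return results

# Example: a learner wants short, mobile-friendly lessons with no extra signup.
catalog = [
    Resource("Spreadsheet Basics", "Coursera", 45, True, True, True),
    Resource("Write a Resume", "LinkedIn Learning", 30, True, False, False),
]
for r in filter_resources(catalog, max_minutes=60, mobile_only=True, no_signup=True):
    print(r.title)  # -> Write a Resume
```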




Measure: User Testing and Experiments

In this phase, we set out to understand users’ interaction with our solution. We ran live experiments, tracked quantitative data through the website's data analytics, and aggregated qualitative data through user interviews and testing.


Experiments:

To better understand user behavior with the tool, we needed to test our assumptions and early hypotheses.


We started tracking a series of metrics to help us make sense of user engagement, validate core assumptions, and identify key friction points:

  • Average session time (time spent on the site) and bounce rate (the share of visitors who leave without a single click), to understand usability and value proposition.

  • Resource clicks, to get feedback on the curated resources and understand preferences for content type, length, and format.

  • External clicks per session, to understand the tool's effectiveness in guiding users to the right resource for their needs.

  • Sign-ups and logins completed, to understand whether users were willing to log in and receive personalized recommendations.

  • Mobile and desktop user counts, to understand platform preferences and primary modes of interaction with Skillbase.
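As a rough illustration of how two of these metrics can be derived from raw session data, here is a minimal sketch in Python. The session records are made up, and in practice analytics tools such as Google Analytics report these metrics out of the box.

```python
# Each session summarized as (duration_seconds, click_count); this shape is
# illustrative only, since real analytics tools report these metrics directly.
sessions = [
    (240, 5),   # engaged visit
    (12, 0),    # bounce: left without a single click
    (95, 2),
    (8, 0),     # bounce
]

avg_session_time = sum(duration for duration, _ in sessions) / len(sessions)
bounce_rate = sum(1 for _, clicks in sessions if clicks == 0) / len(sessions)

print(f"Average session time: {avg_session_time:.0f}s")  # -> 89s
print(f"Bounce rate: {bounce_rate:.0%}")                 # -> 50%
```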


We also ran a series of A/B tests on the MVP to help us understand how best to optimize User Interface (UI) elements and improve the overall user experience. We will highlight some of the takeaways from these A/B tests in the next post in the blog series on product learnings.
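For a flavor of the analysis behind such tests, here is a minimal sketch of a two-proportion z-test comparing click-through rates between two hypothetical UI variants; the traffic numbers are invented for illustration. A small p-value suggests the observed difference is unlikely to be due to chance alone, which is the usual basis for picking a winning variant.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical counts for two UI variants (numbers are made up).
visitors_a, clicks_a = 1000, 120   # variant A: 12% resource click-through
visitors_b, clicks_b = 1000, 160   # variant B: 16% resource click-through

p_a = clicks_a / visitors_a
p_b = clicks_b / visitors_b
p_pool = (clicks_a + clicks_b) / (visitors_a + visitors_b)

# Two-proportion z-test under the null hypothesis of equal click-through rates.
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

print(f"z = {z:.2f}, p-value = {p_value:.3f}")  # -> z = 2.58, p-value = 0.010
```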


User Testing and Interviews:

We leveraged partnerships with organizations across Greater Boston to gather feedback on the tool through individual and group user testing. We partnered with JVS, Boston's largest workforce development organization, and MassHire, the state public workforce system. These partners, as well as other organizations, helped us develop a plan for customizing and distributing the tool based on staff and customer needs. We tested Skillbase with staff and solicited valuable feedback on the product, content, distribution, and promotion strategies. This feedback enabled us to continue refining and improving our approach to reaching target users. Some of the guiding questions during our engagement with users and partners included:

  • Integration: How can the Skillbase tool most effectively augment existing career center systems to drive improved outcomes for jobseekers?

  • User Needs: What types of content do jobseekers find most relevant and useful in their reemployment journey and what formats do they prefer?

  • User Profile: What are the common user habits, sources of motivation and challenges with regards to upskilling and the job search?

  • Impact: In what ways can a virtual tool best connect jobseekers to relevant training resources when human support and training dollars are limited?

  • Variation: Are there demographic or regional differences in how jobseekers engage with online learning content or in the demanded topic areas?

  • Access: Who is accessing the tool? How frequently are they accessing it? Are they returning on a regular basis?

In our next blog post, we will share learnings from running the Build-Measure-Learn feedback cycle: key findings from our experiments, insights about our users and target population, content preferences, and effective distribution strategies. We will also highlight how the cycle informed the design and development of the next iteration of the Skillbase platform!



-----

Interested in partnering with Skillbase? Learn more about partnerships here.

Interested in learnings from the Build-Measure-Learn feedback loop? Check out our third blog post here.


