Tina Nguyen
VP Design
Failing to success in new product development
How Fora is learning to fail and learn fast with experiment streams
Fora’s product development process has been going through a transformation over the past two years. We started from a founder-led process and transitioned to one designed for rapid experimentation and learning.
Our previous product development process relied on clear requirements and certainty from the beginning, followed by months of building an MVP before testing our hypotheses. As a result, mistakes were costly and teams didn’t get meaningful user feedback for many months. Lab testing was rare. (Even if it had been more frequent, it would have given us insight into usability issues, but not definitive information about whether our target audience would actually use the product; that can only be sussed out in production.) By the time an MVP was complete and pushed to production, the strategy had already shifted or teams had already redeployed onto different projects without iterating on the MVP. Historically, Fora averaged 10 months per learning cycle. Despite the desire and support of senior leadership to move more quickly, team members didn’t have the know-how or muscle memory to execute a more iterative approach.
As Fora began focusing more on new product development, we prioritized improving our learning rate so that we could deliver value more quickly to our communities. We designed a more nimble process based on design thinking as taught by Stanford’s d.school. Introducing organizational change is always difficult and we wanted to experiment with our process before truly committing to it. Most of our team members were unfamiliar with this way of working, and it wasn’t yet clear that this would work for Fora. We formed two teams to experiment with our products and the process itself, and we called them experiment streams.
Notable features included:
- a pod anchored by Engineering, Product, and Design leads in a collaborative EPD partnership;
- a flat team structure where there is no “boss” who dictates product direction;
- high-level guidance from Fora leadership in the form of an initial “How Might We” question;
- a 10-14 week timebox in which the team is expected to create an MVP and determine if there is a viable path forward;
- a pod size of 8-10 members from Eng, Product, Design, and other functions; and
- a coach who has experience with new product development to guide teams through the process and monitor the team’s health, fostering psychological safety.
We staffed the two streams and gave them a high level of flexibility and trust. We expected the teams to form and test hypotheses quickly, and we hoped they would also build features that significantly moved the needle on some of our core metrics, but success wasn’t guaranteed. There was a lot of energy and excitement in trying something new, but also anxiety about the inherent risk of an unfamiliar process.
Benefits
Team members felt trusted and empowered to try out many different ideas, fail fast, and learn. Short iteration cycles allowed team members to quickly see the impact of their decisions and correct course. Something that might previously have taken three weeks to figure out was realized in a day.
This was exciting because it empowered team members (and not just senior leaders) to think strategically. Amidst heightened chaos and ambiguity, team members also realized more growth and ownership. Engineers got involved much earlier in the product development cycle, which was exciting for some and uncomfortable for others. While all of our engineers were committed to solving challenging technical problems, some wanted to think deeply about the entire product experience.
Challenges
The experiment streams tried to do two complicated things simultaneously: find product-market fit and operate in a new bottom-up culture. The team was asked to behave in a way that felt foreign and, in some ways, was the opposite of what they were used to. The most challenging aspect was the mental shift from a structured, founder-driven product development process to a fast-paced, bottom-up process.
After interviewing all 18 members of the experiment streams from Engineering, Product, Design, Data, and Community Management, we learned that the team experienced challenges resulting from mismatched expectations, difficulty communicating, real and imagined constraints, and the lack of coaching.
- Consulting vs. all-in: Team members joined with varied expectations of how experiment streams would work and the level of commitment required. Despite prior communication that volunteers would need to devote at least 80% of their time to this effort, some members were assigned instead of asked, and others remained responsible for existing major projects. This created mismatched expectations and behaviors: some viewed themselves as consultants and observers rather than full members who could and should voice their opinions, which led to significant frustration from others about their lack of involvement.
- Not enough vs. too many: Each experiment stream was intended to have only 6-8 members (a PM, a designer, and 4-6 engineers). However, both teams ended up with 9-12 members, which became unwieldy. Having too many members encouraged spectator behavior (in addition to some believing themselves to be consultants), often resulting in too many opinions from too few people. The teams were also heavily staffed with backend engineers who were less familiar with lightweight prototyping and had a difficult time understanding what their roles were when there was no code to be written yet.
- Reactive vs. proactive: Most team members were accustomed to directive leadership and to executing with limited decision-making authority or influence over product direction. While some disliked the restrictiveness of that model, it provided structure and security. Experiment streams lacked that structure; they were based on design thinking, which only provided broad process guidelines. Absent clear directions, many team members struggled to demonstrate initiative; they were unable to bridge the gap between identifying problems and taking proactive steps to address them.
- External vs. self-imposed constraints: Technical limitations contributed to inertia. Today, we release multiple times a day, but back then we had a rigorous but rigid deployment process that made rapid experimentation difficult. There were also more subtle, self-imposed constraints; for instance, team members interpreted guidance and suggestions from leadership as directives.
- Indirect vs. direct communication: Given the ambiguous nature of experiment streams, there was bound to be confusion. Lack of clear communication within the team didn’t help. For instance, leads sometimes made decisions, but some engineers didn’t understand how they arrived at those decisions. This was exacerbated when team members did not proactively ask questions, which resulted in messages being filtered through personal biases and questions about the project being interpreted as a sign of missed expectations.
- Decisiveness vs. inclusiveness: In this new model, the teams worked hard to have everyone’s voices and questions heard because all members needed to understand what the team was trying to achieve and why. However, this sometimes led teams in circles because it was difficult for all members to reach perfect clarity. They were also inclined to make decisions by consensus but found that it was nearly impossible and took too long – round after round of voting was the antithesis of biasing toward action. Despite existing guidance that Product had final decision-making authority on “what we build”, Design on “how it behaves”, and Eng on “how we build it”, the shift from dictated execution to egalitarianism stalled in indecisiveness.
- Hacking it vs. doing it right: When trying to find product-market fit, speed and experimentation are essential, so at times it’s necessary to cut corners. This was challenging for some because their training had focused on thorough, end-to-end, scalable systems. Implicitly, we were asking EPD to ignore the best practices they had learned. Despite the discomfort this created, some team members were game to try something new, while others suffered through it: “Right now we are using Google Sheets as a DB; it’s the sort of thing that one has to do. I can’t tell anyone in my professional network what I’m doing. I can’t put that on my resume, ‘I use Google Sheets.’”
- Quantitative vs. qualitative: Teams tended to rely primarily on quantitative data, which meant that experiments were limited by two-week release cycles and the resulting inbound traffic (at small scales, it was ambiguous what conclusions could be drawn). Their unease with qualitative data (such as user interviews), due to small sample sizes and lack of familiarity, meant the teams missed out on the motivational impact that direct user feedback could bring. “Even if we ran a test where the UI sucks, it’s still inspirational because it generates conversations and ideas. I watched the team go from feeling inspired to feeling drained.”
- Empowerment vs. abandonment: Imagine trying to teach someone how to ride a bike. What would you do? Do you show them the bike and ask them to just give it a go? Do you grip tightly onto the handlebars while they’re attempting to pedal and you both try to steer? Do you let go and watch them fall? Do you ask them to get off and watch you so that they learn? Would they learn? Empowerment and autonomy are the two tenets of experiment streams. Fora leadership was committed to these two principles and deliberately shied away from being prescriptive. They struggled between wanting to help and worrying that the style of helping would reinforce the status quo: teams being overly reactive and reliant on frequent direction. The nudge-and-hint approach did not work, leading some team members to feel abandoned; the teams needed more attention and more frequent course corrections.
- Guidance vs. prescription: The danger of being prescriptive was very real, even if unintended. Depending on who was speaking and who was listening, suggestions and questions were interpreted differently, given the prior top-down experiences of many team members. “The team has a flat structure where people can voice opinions. I feel like when you put together people who are used to hierarchy, they don’t have the understanding to make a decision. They won’t make a decision. No one was willing to make a decision. We would go back in circles. I feel like people are afraid of failing; I try to tell people it’s okay, that it will tell us what not to do. People still don’t understand that until now. A lot of team members want direction and someone telling them what to do and how to move forward.”
- Execution vs. education: Experiment streams attempted to replicate the startup environment, but many team members were unfamiliar with that style of working. It was hard for team members to learn that process while also trying to find product-market fit. It was a dilemma, but leadership was willing to make the investment because they wanted to change the product development process and company culture.
Given the challenging nature of experiment streams, is the process worth iterating on? It would have been tempting to scrap an imperfect process based on how difficult the experience was for team members. The outcome, however, was that the teams averaged 6 learning cycles in 4 months, a 15x increase over our previous rate of one learning cycle every 10 months. One of the experiments also resulted in a double-digit improvement to a core metric. Several UX improvements and product features that started as tiny experiments have since been scaled to our 1000+ forums, after more thorough testing and feedback from community members. Although there is much we could do to improve this process, we experienced many wins and also learned from the tremendous challenges that team members had to work through.
Based on the feedback from team members, the major takeaways from the experiment stream process were:
- Hold design thinking workshops to train future members of experiment streams so that they’re familiar with the philosophy and pace of the work.
- Provide team members with a “Getting Started Guide” to set clear expectations upfront regarding time commitment and responsibilities, decision-making, and scheduling.
- Start with a smaller team of 1 PM, 1 designer, and 2 engineers who are comfortable with ambiguity to figure out product-market fit. Give them time to do an in-depth exploration of the market and user needs to lay the groundwork, and add more team members on an as-needed basis.
- If possible, provide teams with a dedicated coach to guide them through the process.
- Invest in communications training, given the intense nature of the work and the disagreements that will inevitably arise.
Experiment streams are intense experiences given the timebox, and they are now reserved for our riskiest and most ambitious bets. To make product development sustainable in the long term for team health, the learnings from the experiment streams (hypothesis-driven experiments and rapid iteration based on user feedback) now form the basis of Fora’s product development process. The interpersonal challenges that arose gave us insight into team dynamics that should be addressed proactively and explicitly, and that learning led us to other experiments in building our culture. We are all aware that failure precedes success, but the reality of how painful it feels can only be learned by doing. By participating in experiment streams, more team members now have the muscle memory to approach new product development more effectively, which benefits the company as a whole.
Appendix: Feedback from team members
Benefits
- I have been given trust and an opportunity to think freely and solve problems. At this point in my career, it’s exciting because I get to think, not just do. I enjoy the time that I get to spend thinking about things that are conceptual and that feed into strategic intent.
- We release things fast and come up with new ideas, test them, deploy, and get feedback fast; my experience is that it’s not easy.
- I love working with this team.
- This phase, it’s kind of vague. What are we trying to test? How can we test that? The insight that we have, we thought it would take 3 weeks, but it took a day. I love the first tangible thing and people expressing the thing they most want; it’s inspirational.
- The possibility to start experimenting with some ideas and implement them into a product and to become a new (or part of the) product; creating from scratch.
- I got to work with some new people that I hadn’t worked with before. I don’t think we work closely enough and cross-functionally.
- Having less of an engineering focus, user empathy focus, research aspect/talking about user behaviors, collaboration as a team, the idea of having the war room, and figuring stuff out together.
- I never knew about this whole “fake it” thing – it’s cool. It is awesome to see the data from just faking it.
- Failing fast is completely new. To push on and just keep going even if it’s failing, to keep wasting money even if we knew it was going to fail – that’s terrifying, spending time on things that were absolute failures.
- Everyone having a voice is my favorite. Everyone on every team can make a mockup that can inspire everyone else and spark ideas and build off of ideas and steal them – that’s awesome.
- That it’s new – I don’t want to deny the significance of changing our mode of thinking. It’s exciting to think differently.
- I like the tension that happens in the team, it’s uncomfortable but you grow from that. It would feel boring if it wasn’t there.
- There are more engineers involved on the product side and that always makes me happy. Some ask questions that relate to users and I love seeing and engaging with them.
- It’s been cool trying to build new features. It’s cool to help develop the platform rather than just use plugins – trying to better the software we use. It’s in dire need.
- The types of ideas that we have are not overtly trying to make a feature that will sell something to a member that will degrade the member experience.
- Team members get a better grasp on how forums work rather than from an Eng/Design perspective. It’s cool to see everyone having those lightbulb moments from CM’s standpoint.
- We’re getting more creative at this point.
- The ideology behind the experiment streams – think of something that might be useful, pick something, and go and do it.
- How helpful everyone has been, how much they want to take part in it, support has been there, top-notch from the beginning.
- As a team, we are taking control of “the how” and what’s next for our product and the company.
- I like the team members. It feels undefined and fun and I like the chaos and I like it when things speed up and everyone is excited.
- The learnings that I get; to try to think from the Product perspective more than I was before. In this role, I could try making decisions – it allows me to learn how people make decisions. All in all, I like the experience; it’s something that I feel good about. The learning portion is my #1 purpose.
- I enjoyed writing release tools. I like trying to stretch and think about forums and information in a different way: “What would make sense to me as a user?” I don’t usually think too hard about that.
- It’s new. I love that no one knew what was going on. On previous teams, I didn’t feel like I could add much because there were people with more experience. Here, no one knows so I’m fine with contributing.
- In theory, we have a lot of options as long as we can justify them. Experiment streams have broad, overarching objectives and they’re not limited in scope.
(Product) Engineering
- I learned that I always tried to follow the best even when experimenting, but sometimes the best isn’t pixel-perfect UI – need to move faster, not perfect.
- We need to be able to deploy on demand. We would have developed more fully-featured experiments (deeper than shallow) if we knew we could.
- We launched 5 experiments in 4 weeks – that’s not normal, it’s exceptional. There are criticisms about experiments, but we learned at a much quicker rate than the default process.
- It’s really hard for people to break out of the routine, from processes to who comes up with what. For Eng, it’s hard to get out of the way we normally build things even as we’re talking about experimenting now.
- I yearn for doing something crazy, let’s build this thing totally out of Preact, like a hackathon.
- Some team members are too hesitant to try things, getting too caught up in the weeds of strategy.
- Having less of an engineering focus, user empathy focus, research aspect / talking about user behaviors, collaboration as a team about it, the idea of having the war room, and figuring stuff out together.
- Tell everyone what the expectations are or that it’s okay to make mistakes – we won’t blow up our platform. If this came top-down people would be more inclined to make mistakes.
- The mental model is changing, more agile, and more rigorous, but there are shortcuts.
- There are not that many companies that would allow this kind of experimentation. I was excited to jump in and try. I’m not sure if my expectations will be met, but I learned a lot of things. If it turns out great, then it will be helpful for my career in the future.
- The hardest for me is to break out from the mentality of the best way to do it, to go fast and not break production. I’m learning to be more flexible, not rigid, and that could be pulling back from trying to boil the ocean.
- Engineering did brainstorming and we always talked about the pros and cons, but we never really thought about new features and why we built this. We just didn’t know. There was no market research, just the say-so of an executive.
- The PM is straightforward and is more blunt about what is the right way. What I was suggesting didn’t make sense and he opened my eyes. If we get on the same page and understand the person’s perspective, it’s more beneficial.
- There was miscommunication about what needed to be built and we wasted time building. Normally, I would snap, but now, let’s hear you more and see what you’re working for. I get it, let’s make a compromise that makes sense and we were able to crunch and get it done.
- Some part of me wants to leave something in the production code that is our blood and sweat, but if it’s all failure nothing will stick. I understand we may not get there; R&D at big companies could be failures that you never hear of.
- I felt zero prepared and was not sure what was expected of me, but I was excited. We had issues with my previous team and buy-in and was curious how you guys do it. I wanted to experience it – it may be a big failure for all I care, but I’ll learn.
- I’m willing to learn. If I have to touch up on React or data pipelines, whatever is being used. Some on the team are not comfortable picking up tickets, but they’re part of the stream, they could learn, but it doesn’t interest them.
- People are working overtime, but at some point, it shouldn’t be, work-life balance needs to be there. I can’t do it all the time. For engineering, it’s up and down. Sometimes it’s overtime and sometimes there are a lot of breaks. Product and Design have been working more constantly; sometimes Design is more bombarded.
- It’s frustrating trying to keep people happy.
- It’s hard to get into a rhythm and know when it’s high intensity and when it can be in flow any given day or week – it’s hard to mentally prepare.
Engineering
- I don’t mind that sometimes we’ll fail. I am scared that we’ll learn the wrong lessons from the failures because failure is good if we learn the correct lessons – I’m not sure if we’re learning the right ones.
- When I was working on our platform, I just added features here and there, but I never thought about whether we could restructure/scrap it and rebuild it. There were a lot of things I never got a chance to think about.
- We’re so used to planning for a few weeks to work on stuff, work, and then deploy; now we’re jumping here and there. Sometimes I’m kind of scared but it’s also exciting to be like that.
- I feel like there are more engineers than for the work needed.
- I want to do more engineering work. I’m not used to the product hat. During brainstorming, I feel like everyone’s ideas are cool, but I don’t have many. I can contribute more to voting or more doable things rather than coming up with ideas.
- I miss coding a lot.
- It seems that everyone wants to prove that there are a lot of things that we’ve done, but the why, it’s not there. We’re too eager to get things done, but we don’t know why or how experiment streams should be done; the values are missing.
- The team moves very fast, but I'm not sure that moving fast alone is good; we have to move fast and in the correct direction.
- We were focused on doing stuff: what to do, how to do it, we lost sight of why.
- The fact that we don’t take risks isn’t the problem. The problem is valuing productivity / getting things done. Experiment streams aren’t about productivity. We’re looking too much at quantitative feedback rather than qualitative. If you focus on getting things done, you focus on the easy path and so you don’t take risks.
- We weren’t used to discovery as work. When we were asked to do user research, Eng was confused and not sure how to go about it.
- I joined because the experiment streams sound cool and interesting. The other compelling thing is that I wanted a closer connection to the end user. I felt it was lacking in the previous roles on other teams: “How do I know what impact I have?”
- I’m slightly confused about the roles, the distinction of roles, and what they should be doing between PM and Design. Roles are starting to distinguish themselves, but in the beginning, PM/Design seemed to blend.
- I like the clarity of roles, it leads to less friction. Among engineers, there are strong personalities which makes it hard to collaborate, and having fuzzy roles just makes things worse.
- Keeping up with updates from PM/Design, it’s overwhelming. There’s too much noise and too much info. For us Eng, we need to sit down to think and dive deeply; Eng prioritizes depth.
- I’m longing for deeper engineering work.
- I want to fast forward to the future where we have sufficient experience, where we have this process nailed – we would have a process framework laid out with distinct stages with distinct outputs and have templates for stuff figured out. Nothing would be a surprise, everyone would be confident in the process and what they’re supposed to be doing in the process.
- Expectations were not clear. It wasn’t until about 3 weeks in that we knew what was going on. I was quite lost and a lot of people were, but when we started into “How might we”, it started getting clearer what our objectives were and what we were trying to get to.
- I had no idea what was going on. I thought we already had ideas to implement. I didn’t think we’d be involved from the beginning. It’s very fun and exciting to see the process unfold. It’s not my strongest suit and a bit boring at the same time.
- I wish we could experiment with all the ideas, but we can’t, it would take forever.
- There’s not enough data to make properly informed decisions. We need to enable some data collection to figure out anything (i.e. what are users searching for). We need to know the behavior of all users across sites to give us more info.
- The design thinking exercise was amazing, but people need more structure so they know what to expect.
- There’s also a lot of free time, the meetings were staggered. I didn’t do much work outside the streams and not much happened inside of the stream. Lots of downtime was not helpful, but if I knew what I could do, that would be helpful.
- I don’t like uncertainty, the uncertainty of interpreting results is super stressful. We have 3 different theories for the numbers and we don’t know why.
- It’s hard to feel like my skills are being used well. It’s hard to find a niche.
- It felt like the core work that needed to get done faster; I’m not equipped to do that. I feel like I could do a lot more.
- People still think it’s wrong to fail. I’m okay with making mistakes, but getting everyone on the same page is causing hurdles.
Product Management
- I don’t know how to coordinate and communicate with such a large team.
- I don’t like telling people what to do.
- I don’t know how to communicate this picture in my head externally.
- I feel stressed out and guilty about engineers not feeling fulfilled.
- Project management is not my strength and not something I want to do.
- I feel imposter syndrome.
- I just want to build something cool and make an impact.
- Oh, this is what PMs are supposed to do (creating a strategy around how to tackle the “How might we”, corralling the team around it, figuring out what we want to do, and proposing a map to senior leadership).
- It’s frustrating getting engineers to move forward and assist when the participation isn’t there (Slack or meetings). It’s hard to engage.
- Frustrating to have to manage feelings (i.e., Eng saying Design is doing nothing).
- I’m pretty excited. I’m getting to do something completely new. I felt lucky and nervous as well because it’s so new. It seemed like no one on the team knew what was going on and that turned out to be true.
- The slowness of the team and reliance on team members to make progress is frustrating where most of my best contributions were siloed, working on things even when I was told not to.
- I’m learning how difficult it can be to keep things going at a good pace when there’s so much uncertainty, and figuring out how to get the team unstuck. When there are disagreements about small processes, being able to identify when that’s happening and moving people away from it.
- I feel conflicted about the speed. Maybe it’s my feeling anxious that we’re not doing enough. It looks slow, but it’s faster than before.
- Everyone understands, in theory, that we should be doing something different, but no one understands how we can make that happen.
Design
- Right now, I’m unable to keep up with the PM(s) and engineers.
- There are different styles of engineering that I need to be more thoughtful of – the ones I thought wouldn’t like the thinking, do.
- It’s important to get something tangible to get people to push.
- I imagined working PM/Eng, pairing, working together extremely closely – that’s not the case and it’s feeling less so now.
- There are more engineers involved on the product side and that always makes me happy. Some ask questions that relate to users and I love seeing and engaging with them.
- So many engineers – I feel rushed. It’s a lot of work on research and design. I focused on design and working with engineers and sometimes did a half-assed job – I would like to spend more time.
- It’s not super clear. I sometimes sense that there’s a fine line between assigning blame and holding people accountable. I feel like I've been blamed for certain things – nothing significant, just the team not understanding the context.
- I don’t mind the check-ins but an end time contradicts the project.
- What I imagined the experiment streams to be and what it’s turning out to be is different – it lacks the depth that I’m looking for (user research).
- Direction changed back and forth. It’s weird and ambiguous and I don’t like to create work to keep people busy.
Data/Community Management
- In the stream, everyone was confused. There was a lack of understanding, people missed meetings, didn’t pay attention, and asked people to repeat things.
- I didn’t have expectations of being super hands-on. It’s changed; I’m super hands-on now. I’ve been in product meetings before but never built.
- If we do another stream, I’d be happy to join again.
- I never knew about this whole “fake it” thing – it’s cool. It’s awesome to see the data from just faking it.
- Failing fast is completely new. To push on and just keep going even if it’s failing, to keep wasting money even if we knew it was going to fail – that’s terrifying, spending time on things that were absolute failures.
- Lack of time to focus on just the experiment stream. There’s a lot of context-switching (I have other responsibilities).
- I had zero expectations. I had no idea what I was getting into. I was so clueless. I didn’t expect to be part of the core team at all. I read the docs but couldn’t fully grasp what experiment streams are and expectations of me specifically. I only consult, but I’m very much in it now.
- I’m learning about engineering and design processes, more about brainstorming and design sessions (there’s not a ton of creativity in my discipline), and different perspectives.
- I like the diversity on the team; it’s nice to see. I’m usually the “only of” on a call, but here I see different faces.
- It feels like we don't have enough time/data to make a decision. I’m used to more long-term trials; 7 days is not enough.
- I need to do a better job of focusing on X experience, not just site-wide, but focusing on a smaller segment of users.
- Team members lack understanding of the product (communities) and how it operates, forums in general.
- Lack of wanting to just push forward – we don’t need to rehash this all the time.
- A lot of time is wasted by people saying “I’m confused” or “I don’t understand” and the rounds of voting and repeating explanations; 5 meetings until someone takes control to move it forward. It’s really surprising how often things were derailed that needed explanations on basic things.