April 10, 2019

Developing AIESEC’s Mobile App in 6 months.

In a bottom-up process, our organization had determined that it would be beneficial to create a product with a defined set of functionalities that would engage our users after they had purchased our service.

AIESEC is a global non-profit run by youth, providing leadership opportunities to young people through exchange experiences. At that time, our digital ecosystem consisted of our platform at aiesec.org, which facilitates the sign-up and on-boarding process and allows our customers to find and apply to opportunities and track the status of their applications.

I knew I had to level up to ensure the success of the project, so I started reading about agile methodologies and the UX lifecycle. My project team and I ended up utilizing the following concepts: GV's Design Sprint (https://www.gv.com/sprint/), customer insight research, and agile development.

Since we only had very limited resources (e.g. two front-end developers, with design and backend shared with another team), I knew that we had nothing to waste, especially not time. Everything we did was aimed at delivering user-validated increments as fast as possible. Sometimes we did not hit our goal of releasing an increment on time, as backend's priorities were not dictated by us and sat outside of our backlog.

We opted for a hybrid app framework, as our developers already had expertise with it and we deemed it the fastest approach to get to market. However, some product requirements could have been implemented better in a native app.

Design Sprint

A key constraint was time, compounded by the fact that our development team was located in a different time zone. To still be able to conduct design sprints, we split up the process: the Monday-to-Wednesday agenda was condensed into one three-hour meeting, while the Thursday and Friday objectives were achieved over the course of the actual week.

Since it was important for us to validate the idea behind each feature and whether it would actually solve our customers' problems, we needed to run as many customer interviews as possible. I got access to a group of customer support specialists in various regions and markets, who would sit down with our customers in their respective offices and run their own set of interviews. This allowed us to gain validation through interviews across different customer segments.

During the project, I worked closely with Marketing, aligning in weekly touch points to ensure they were continuously on-boarding users to newly released features. However, I failed to include Operations, so our on-ground volunteers had little ability to direct our customers to the product to solve their problems.

What went well ✅
  • This process allowed everyone to work at their own pace.
  • We were able to keep all elements of the design sprint.
  • The storyboard had to focus on key functionality, preventing feature creep.
  • Scalable user feedback.
What went wrong ❌
  • The designer constantly felt rushed, having to make tradeoffs and omit what might have been good ideas.
  • Feedback to design was seldom implemented since the next design sprint would already start the next working day.
  • It was hard to keep the customer support specialists accountable for this extra role, and they did not always deliver.
Our agenda for a 3-hour Design Sprint

Customer Insights

Before producing anything in the project, it was up to me to validate its overall purpose. We employed various product analysis methods:

Since AIESEC operates in 100+ national markets, we knew that any insight I generated alone would be very generalized and subject to high variance. Hence, I recruited 14 customer support specialists and trained them to do the same analysis that I was doing. This allowed me to document my process and learnings on the fly and communicate them directly to my team. These learnings were cascaded through detailed guidelines and weekly webinars.

Data analysis

For each customer, we tracked which service standards they were receiving. We correlated this with the NPS these customers gave us and with which of the service standards they were most and least satisfied. This gave us a good idea of which areas of our service we should and should not focus on when building solutions into this digital mobile product.

Top Detractor Issues
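The post doesn't show the underlying analysis, but as a minimal sketch of the idea in pandas (the file name and column names below are hypothetical placeholders, not our real tracking schema), it could look something like this:

```python
import pandas as pd

# Illustrative sketch only: file name and column names are hypothetical,
# not the real schema of our internal tracking systems.
df = pd.read_csv("service_standards.csv")  # one row per customer x standard

def nps(scores: pd.Series) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    return round(100 * ((scores >= 9).mean() - (scores <= 6).mean()), 1)

# NPS of customers who did vs. did not receive each service standard.
by_standard = (
    df.groupby(["service_standard", "delivered"])["nps_score"]
      .apply(nps)
      .unstack("delivered")
      .rename(columns={True: "nps_delivered", False: "nps_missed"})
)
by_standard["nps_gap"] = by_standard["nps_delivered"] - by_standard["nps_missed"]

# The biggest gaps mark the service areas most worth building solutions for.
print(by_standard.sort_values("nps_gap", ascending=False))
```

Sorting by the NPS gap surfaces the standards whose absence hurts satisfaction the most, which is essentially what a detractor-issue ranking captures.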

Qualitative interviews

We used these insights to probe in the right direction when we invited customers who had recently returned from an exchange experience for explorative, qualitative customer interviews. This allowed us to figure out why these particular service standards were delivered unsatisfactorily, or in some cases not at all.

  • What was the hardest part about going on exchange from [home country] to [host country]?
  • Tell me about the first time [problem] happened.
  • Why was that hard?
  • What, if anything, have you done to solve that problem?
  • What don’t you love about the solutions you’ve tried?
  • What specifically could AIESEC have done to support you better?
Quantitative surveys

We wrote a short sentence describing each of these reasons, created a survey, and sent it out to a large set of our customers, asking them to rate the likelihood of each problem occurring during their experience. Each of my customer support specialists analyzed their own subset of the data, derived recommendations for their market, and handed them to me. We prioritized the problem sets and started tackling them in our design sprints, each to be solved with its own MVP solution.
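As before, a hedged sketch of how that aggregation might look in pandas (hypothetical column names; in reality each specialist analyzed their own market's subset):

```python
import pandas as pd

# Illustrative sketch only: column names are hypothetical. Each row is one
# respondent's 1-5 rating of how likely a problem was during their exchange.
survey = pd.read_csv("survey_responses.csv")  # market, problem, likelihood

# Global ranking: the prioritized problem sets fed into the design sprints.
overall = (
    survey.groupby("problem")["likelihood"]
          .mean()
          .sort_values(ascending=False)
)
print("Global priority ranking:\n", overall)

# Per-market view: exposes conflicts where a problem that tops the list in
# one market barely registers in another.
per_market = (
    survey.groupby(["market", "problem"])["likelihood"]
          .mean()
          .unstack("problem")
)
print("\nPer-market averages:\n", per_market.round(2))
```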

What went well ✅
  • Creating one product fit for different markets.
  • Teaching customer support specialists to do product analysis proved to be fairly easy by providing clear instructions, trainings, and templates.
What went wrong ❌
  • When priorities for multiple markets conflicted, we had to accept that the final product might not be able to solve a particular problem in all markets.
  • It was hard to set accountability with remote customer support specialists, who did not work on this project as their primary role.

Agile Development

It was my responsibility to break the user requirements down into user stories based on the design, the goal of the product and the customer insights gathered. The development team then worked on implementing the product increments. One key constraint was that we were only given 6 months to complete the product, so we would be ready to launch at our yearly executive gathering.

After planning out the project timeline, we knew we only had a certain number of sprints available to deliver our product: six months translates to roughly thirteen two-week sprints. We estimated we would be able to tackle 7 problem sets by time-boxing the development of each product increment into a two-week sprint.

What went well ✅
  • Time-boxing ensured the product would be delivered by the end of the 6 months.
  • Continuously releasing usable increments of the product allowed us to continuously gather insights from our users.
  • The development team had a strong understanding of the requirements, which allowed us to deliver a complex product fast.
What went wrong ❌
  • Features had to be scoped down during the sprint, missing key functionalities.
  • No time to implement feedback gathered from users who were actively using the product in its beta phase.
  • No time planned for integrations with existing systems, resulting in a product detached from the organization's digital ecosystem.

Final outcome:

What went well ✅
  • Going to market fast to quickly validate our assumptions.
  • Minimizing risk through the low amount of resources invested.
  • 50% adoption rate of customers after the initial release.
What went wrong ❌
  • Focused only on synergy with Marketing, resulting in a high adoption rate but low engagement, as our volunteers on the ground were not aware of the new product and could not encourage customers to keep using it.
  • Implementing frameworks such as Scrum would have allowed us to create releasable increments after every sprint.
  • Taking more time to validate the architecture used could have resulted in a better product.

