Transport Marketplace Counteroffer Workflow

Indigo Ag is an AgTech startup with a diverse set of logistics offerings for farmers. Our small engineering team produced an MVP of a feature that allowed parties in a two-sided grain transportation marketplace to send each other offers. After the MVP was released, I was asked to iterate on the offer feature design in order to drive up value for our users.


The Challenge

User research and feedback from sales calls suggested that counteroffers and offer responses were important to our users. However, these features were deemed out of scope for the initial MVP. The KPI guiding both the MVP and the subsequent iterations was the number of accepted offers on the marketplace platform. Part of the challenge was understanding how to drive up offer acceptances in a lean, iterative way without building out a fully featured two-sided marketplace.

My team (composed of me and another designer) had limited design and research resources with which to arrive at a strong decision. We employed a number of techniques, including validating ideas against existing research, lean concept testing, and usability testing of wireframes and proofs of concept, ultimately arriving at a satisfactory iteration that has outperformed the MVP post-launch.

My design counterpart and I began with an intentionally open scope in order to encourage divergent thinking. We started by brainstorming potential nuances, solutions, research questions, and sequencing considerations.

We aligned on our desired scope for both sides of the marketplace to ensure that our two teams would be able to deliver a seamless end-to-end experience, including shared timing expectations, notification preferences, etc. We then identified a few questions that we felt our PM partners would be better suited to answer (legal considerations, platform terms, etc.), as well as some that we felt we could de-risk with feedback from our internal experts.

After receiving initial answers from our PMs, we used our initial scope considerations to create a small prototype to use as a conversation aid with our internal experts (this prototype was later updated and used successfully as a sales enablement aid).

We performed exploratory research with our internal experts, focusing mostly on user expectations for key decision points (# of rounds of counteroffers, expiration timing, etc.). This feedback highlighted several key areas of the experience for us to improve on. Using this feedback, we explored multiple potential solutions to a particular edge case (as shown in the example below).

This project was particularly complex for a few reasons:

It involved both sides of our marketplace, which required tight coordination between teams, since every decision had constraints and repercussions on both sides.

It expanded our existing data model for the concept of an "offer," requiring clear communication with our engineering counterparts.

It intersected with other in-flight work regarding order editing and flexibility, so any potential solutions had to be evaluated against that project's constraints as well.

In the end, we arrived at a tightly scoped solution that included a single round of negotiation, the ability to accept partial order quantities, and clear expiration timers. We felt strongly that these initial bets would give us the flexibility to continue building iteratively on the dynamics of the marketplace. Initial reactions post-release have been largely positive, and our KPI of "the number of requests matched on our platform" increased significantly in the weeks following launch.
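
For illustration, here is a minimal TypeScript sketch of how an expanded offer model along these lines might be shaped, capturing the three mechanics described above: a single counteroffer round, partial-quantity acceptance, and explicit expiration. It assumes a shipper/carrier pairing and per-ton pricing; every name and field below is hypothetical rather than Indigo Ag's actual schema.

```ts
// Hypothetical, illustrative data model; not the production schema.

type OfferStatus =
  | "pending"   // awaiting a response from the other party
  | "countered" // a single counteroffer has been issued
  | "accepted"  // accepted in full or in part
  | "declined"
  | "expired";  // expiration timer elapsed without a response

interface Counteroffer {
  proposedQuantityTons: number;
  proposedRatePerTon: number;
  expiresAt: Date;
}

interface Offer {
  id: string;
  shipperId: string;
  carrierId: string;
  requestedQuantityTons: number;
  ratePerTon: number;
  status: OfferStatus;
  expiresAt: Date;               // clear expiration timer shown to both parties
  counteroffer?: Counteroffer;   // at most one: a single round of negotiation
  acceptedQuantityTons?: number; // may be less than requested (partial acceptance)
}
```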