Goal-oriented Mindset
  
  
Narrow Down The Scope
The Microsoft team informed us that the previous generation of FarmGazer had significant limitations: the camera was basic, capable only of taking photos at limited range and low resolution, which made accurate analysis impossible. Our task was to optimize and expand the product in the areas of plant disease and pest management to make it smarter.
  
After evaluation, we found that the structural design of the previous version constrained its capabilities; in particular, the camera could only capture content from a distance. To achieve the required precision, we needed to redesign the product's structure.
  
  
Jobs to Be Done
Primary: Accurately detect pest attacks and reduce production losses.

Secondary: Help farmers monitor their fields more efficiently, saving them time and money.
*Extra Requirement: The solution had to be scalable for widespread future adoption, even in less developed areas.
Strategic Mindset
  
  
ML or CV?
To achieve this goal, we explored two potential solutions: machine learning (ML) and computer vision (CV). After evaluating implementation complexity, scalability, and cost-effectiveness, we opted to use CV as the foundation to drive future ML advancements.
  
  
Considering our team’s expertise, development timeline, and budget constraints, we decided on a solution that combines a Sharp sensor with a motorized camera.
  
  
  
For the user experience (UX) design, based on the target user demographics (such as age and skill level) as well as our team size and project timeline, we developed two prototypes: Plan A and Plan B. We conducted independent usability tests with members of the Microsoft team, and the results showed higher user satisfaction with Plan B, which achieved an 80% approval rate.
  
  
Field research showed farmers mainly need pest alerts and action steps, so we moved detailed data to a separate page, leaving only key info and calls-to-action on the Dashboard.

Pros:
1. Reduces cognitive load
2. Streamlines workflow (no need to sift through data)
3. Improves usability for all users

Cons:
1. An extra click to get in-depth data
Break-down Mindset
  
  
Field Testing
After completing the second-generation demo, covering both the enclosure design and the code, we eagerly conducted our first field test.
The field test revealed significant issues across multiple aspects: the enclosure, the algorithm, and the user experience (UX).

▶︎ In real-world conditions, the enclosure’s design fell short—moisture in the environment caused component damage and short circuits.

▶︎ The algorithm, meant to streamline pest detection, was easily thrown off by debris such as leaves and dead insects, leading to an incessant stream of false warnings. Instead of easing the user's workload, the front-end app overwhelmed them with non-stop notifications. Worse, the detection system misclassified some images and gave users no way to correct the errors, an oversight that left them frustrated and disheartened.
Iterations
Rather than seeing this as a setback, we chose to break the problem into manageable pieces and tackle each one systematically.

▶︎ For the enclosure, we redesigned its structure to completely isolate the components from the external environment, making the device more robust, stable, and easier to maintain.

▶︎ On the algorithm side, we delved into ways to filter out debris interference, ensuring it would only focus on relevant data.

▶︎ For the UX, we reimagined the warning system. We introduced features to let users adjust the frequency of warnings and manually correct detection errors. To inspire confidence, the system would also periodically send “all-clear” messages when no issues were detected.
  
  
  
Each member of our team took ownership of one problem area. We dove deep into research, learning everything we could about potential solutions. Every week, we regrouped to consolidate our findings: one team member would propose a solution, and the rest of us would critically evaluate and refine it together. This collaborative process became the turning point for the project.

Through persistence and iteration, we developed a Binary Grayscale Thresholding system that dramatically reduced the false warning rate. Our solution was not only a breakthrough for the product but also worthy of recognition—we published our findings in IoTaiS, marking a proud moment for our team.
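
The sketch below illustrates the core idea behind that thresholding approach, not our exact production code: convert each frame to grayscale, apply a binary threshold to separate dark, pest-like regions from the background, and reject blobs whose size falls outside a plausible pest range (which is where debris such as leaves and dead insects gets filtered out). The threshold value, the area limits, the function name, and the use of OpenCV here are illustrative assumptions.

```python
import cv2
import numpy as np

# Hypothetical parameters -- the real values were tuned against field imagery.
THRESH_VALUE = 90              # grayscale cutoff separating dark pests from background
MIN_AREA, MAX_AREA = 40, 800   # plausible pest size range in pixels; debris falls outside it

def count_pest_candidates(frame_bgr: np.ndarray) -> int:
    """Return the number of blob-like regions that look like pests."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)  # suppress sensor noise before thresholding
    # Binary thresholding: pixels darker than THRESH_VALUE become foreground.
    _, binary = cv2.threshold(gray, THRESH_VALUE, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only pest-sized blobs; oversized debris and tiny specks are ignored.
    return sum(1 for c in contours if MIN_AREA <= cv2.contourArea(c) <= MAX_AREA)
```

In a sketch like this, most of the debris rejection comes from the size filter rather than the threshold itself, since large leaves and tiny specks fall well outside a typical pest's footprint; the exact tuning in our system is not reproduced here.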
  
  
  
Learning Mindset
  
  
Into New Territories
Connecting the front-end to the back-end database turned out to be a daunting challenge for our team of designers—it was completely outside our expertise. We soon realized that the front-end code couldn’t communicate directly with the database.

After some research, we discovered that we needed an intermediary Flask server to bridge the gap. For a team already pressed for time, this seemed like an insurmountable obstacle.
  
My Own SOP of Learning
I decided to step up and take on the challenge of building the Flask server myself. I knew it wouldn’t be easy, but I was determined to find a way. Here’s how I approached it:

1. Seek guidance from the people closest to my goal
I reached out to two instructors who taught front-end development at UW. Their experience helped me understand industry best practices. With their insights, I planned a solution that aligned with our limited resources.

2. Break the problem into smaller pieces
I analyzed the specific features our front-end required and translated those needs into a clear set of capabilities the Flask server needed to deliver.

3. Learn and iterate
With no prior experience, I turned to online tutorials and resources, leveraging tools like ChatGPT to quickly grasp new concepts. I worked tirelessly, experimenting, debugging, and refining until I got it right.

After two intense weeks of learning and trial-and-error, I successfully built the Flask server, enabling smooth communication between the front-end and the database. This breakthrough kept the project on track and ensured we met our deadlines.
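
For context, here is a minimal sketch of the kind of bridge that server provides: the front-end calls simple HTTP endpoints, and Flask translates those calls into database queries. The endpoint names, table schema, and use of SQLite are assumptions for illustration; the real project's API and database may differ.

```python
import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
DB_PATH = "farmgazer.db"  # hypothetical database file name

def query_db(sql, args=()):
    """Run a read-only query and return rows as plain dicts so Flask can JSON-encode them."""
    with sqlite3.connect(DB_PATH) as conn:
        conn.row_factory = sqlite3.Row
        return [dict(row) for row in conn.execute(sql, args).fetchall()]

@app.route("/api/alerts")
def get_alerts():
    # The front-end polls this endpoint for recent pest warnings.
    limit = request.args.get("limit", 20, type=int)
    rows = query_db(
        "SELECT id, timestamp, label, confidence FROM detections "
        "ORDER BY timestamp DESC LIMIT ?",
        (limit,),
    )
    return jsonify(rows)

@app.route("/api/alerts/<int:alert_id>/correct", methods=["POST"])
def correct_alert(alert_id):
    # Lets users manually fix a misclassified detection (the UX iteration described above).
    new_label = request.get_json(force=True).get("label")
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute("UPDATE detections SET label = ? WHERE id = ?", (new_label, alert_id))
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    app.run(debug=True)
```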